How do I search and query based on time?

In the addFieldsStage variable, make these two substitutions to use $dateTrunc:

(1) Substitute "$$this.time_stamp" with:

bson.D{{"$dateTrunc", bson.D{{"date", "$$this.time_stamp"}, {"unit", "second"}, {"binSize", 60}} }}

(2) Substitute "$latest_time_stamp" with:

bson.D{{"$dateTrunc", bson.D{{"date", "$latest_time_stamp"}, {"unit", "second"}, {"binSize", 60}} }}

That is fine. I suggest you create a Compound Index on the fields trading_pair and time_stamp, as these are used together in the query discussed in this post. This kind of index on a Time Series collection is referred to as a Secondary Index (MongoDB creates a default index on the time field as part of creating the Time Series collection). This Secondary Index is for improving query performance.

Will the compound/secondary index look like this? { "time_stamp": 1, "trading_pair": 1 }

Also, here is what I have for the collection settings:

	opts := options.CreateCollection().
		SetTimeSeriesOptions(options.TimeSeries().
			SetGranularity("minutes").
			SetMetaField("trading_pair").
			SetTimeField("time_stamp")).
		SetExpireAfterSeconds(2592000)

and the current output in mongo-express looks like this:

{
    time_stamp: ISODate('2022-02-08T08:59:12.000Z'),
    trading_pair: 'REP-BTC',
    trading_disabled: false,
    status: 'online',
    price: 0.000341,
    _id: ObjectId('6202311075483463619c0723')
}

No, the fields need to be in this order: { "trading_pair": 1, "time_stamp": 1 }

I made the substitutions and have the following:

	period := 300000    // 5 * 60 * 1000 = 5 mins, in milliseconds

	sortStage := bson.D{{"$sort", bson.D{{"trading_pair", 1}, {"time_stamp", -1}}}}
	groupStage := bson.D{{"$group", bson.D{{"_id","$trading_pair"}, {"latest_time_stamp", bson.D{{"$first", "$time_stamp"}}}, {"latest_price", bson.D{{"$first", "$price"}}}, {"docs", bson.D{{"$push", "$$ROOT"}}}}}}
	addFieldsStage := bson.D{{"$addFields",  bson.D{{"previous_doc", bson.D{{"$arrayElemAt",  bson.A{bson.D{{"$filter", bson.D{{"input", "$docs"}, {"cond", bson.D{{"$eq", bson.A{bson.D{{"$dateTrunc", bson.D{{"date", "$$this.time_stamp"}, {"unit", "second"}, {"binSize", 60}} }}, bson.D{{"$subtract", bson.A{bson.D{{"$dateTrunc", bson.D{{"date", "$latest_time_stamp"}, {"unit", "second"}, {"binSize", 60}} }}, period }}}}}}}}}}, 0}}} }} }}
	projectStage := bson.D{{"$project", bson.D{{"trading_pair", "$_id"}, {"_id", 0}, {"latest_time_stamp", 1}, {"latest_price", 1}, {"previous_time_stamp", "$previous_doc.time_stamp"}, {"previous_price", "$previous_doc.price"}, {"percent_change", bson.D{{"$divide", bson.A{ bson.D{{"$multiply", bson.A{ bson.D{{"$subtract", bson.A{"$latest_price", "$previous_doc.price" }}}, 100 }}}, "$previous_doc.price" }}}} }}} 
	
	cursor, err := collection.Aggregate(ctx, mongo.Pipeline{sortStage, groupStage, addFieldsStage, projectStage})

	if err = cursor.All(ctx, &results); err != nil {
		fmt.Println("cursor.All() error:", err)
	}

and am still getting the following:

[
    {
        "latest_time_stamp": "2022-02-08T19:05:41Z",
        "previous_time_stamp": "0001-01-01T00:00:00Z",
        "trading_pair": "CRO-USD",
        "latest_price": 0.4781,
        "previous_price": 0,
        "percentage_change": 0
    },
    {
        "latest_time_stamp": "2022-02-08T19:05:41Z",
        "previous_time_stamp": "0001-01-01T00:00:00Z",
        "trading_pair": "ADA-USDC",
        "latest_price": 1.144,
        "previous_price": 0,
        "percentage_change": 0
    }
]

I have also added the index { "trading_pair": 1, "time_stamp": 1 }

Could you include the input data you used to run the query?

I am starting to get these errors

cursor.All() error: (BadValue) Executor error during getMore :: caused by :: can't $divide by zero

 2022/02/09 09:12:32 (BadValue) PlanExecutor error during aggregation :: caused by :: can't $divide by zero

Here is the data:

[
    {
        "time_stamp": "2022-02-09T09:05:35Z",
        "trading_pair": "CLV-GBP",
        "price": 0.29,
        "volume_24_hour": 113540.26,
        "status": "online",
        "trading_disabled": false
    },
    {
        "time_stamp": "2022-02-09T09:06:54Z",
        "trading_pair": "CLV-GBP",
        "price": 0.29,
        "volume_24_hour": 113540.26,
        "status": "online",
        "trading_disabled": false
    },
    {
        "time_stamp": "2022-02-09T09:08:13Z",
        "trading_pair": "CLV-GBP",
        "price": 0.29,
        "volume_24_hour": 113540.26,
        "status": "online",
        "trading_disabled": false
    },
    {
        "time_stamp": "2022-02-09T09:09:31Z",
        "trading_pair": "CLV-GBP",
        "price": 0.29,
        "volume_24_hour": 113540.26,
        "status": "online",
        "trading_disabled": false
    },
    {
        "time_stamp": "2022-02-09T09:10:49Z",
        "trading_pair": "CLV-GBP",
        "price": 0.29,
        "volume_24_hour": 114540.26,
        "status": "online",
        "trading_disabled": false
    }
]

and the result when it does not error:

[
    {
        "latest_time_stamp": "2022-02-09T09:17:12Z",
        "previous_time_stamp": "0001-01-01T00:00:00Z",
        "trading_pair": "CLV-GBP",
        "latest_price": 0.29,
        "previous_price": 0,
        "percentage_change": 0
    }
]

The values were taken at different times, so the timing can be off… I just tried to grab data to show what I am getting.

With this data and the following query:

db.test.aggregate([
  { 
    $sort: { trading_pair: 1, time_stamp: -1 } 
  },
  { 
    $group: { 
        _id:  "$trading_pair",
        docs: { $push: "$$ROOT" }, 
        latest_time_stamp: { $first: "$time_stamp" }, 
        latest_price: { $first: "$price" }
    }
  },
  { 
    $addFields: {
        prev_doc: {
           $arrayElemAt: [
               { $filter: {
                      input: "$docs", 
                      cond: { 
                          $eq: [ 
                              { $dateTrunc: { date: "$$this.time_stamp", unit: "second", binSize: 60 } }, 
                              { $subtract: [                                    
                                    { $dateTrunc: { date: "$latest_time_stamp", unit: "second", binSize: 60 } }, 
                                    300000 
                               ] } 
                          ] 
                     }
               }}, 0
           ]
        }
    }
  },
  {
     $addFields: { prev_doc: { $ifNull: [ "$prev_doc",  { time_stamp: "none", price: "none" } ] } }
  },
  {
     $project: {
         trading_pair: "$_id",
         _id: 0,
         latest_time_stamp: 1,
         latest_price: 1,
         prev_time_stamp: "$prev_doc.time_stamp",
         prev_price: "$prev_doc.price",
         percent_change: {
             $cond: [
                  { $eq: [ "$prev_doc.price", "none" ] },
                   "none",
                   {
                       $divide: [ 
                           { $multiply: [ 
                               { $subtract: [ "$latest_price", "$prev_doc.price" ] }, 
                               100 
                           ] }, 
                           "$prev_doc.price"
                       ]
                   }
             ]
         }
    }
  }
])

The result is, as expected:

[
  {
    latest_time_stamp: ISODate("2022-02-09T09:10:49.000Z"),
    latest_price: 0.29,
    trading_pair: "CLV-GBP",
    prev_time_stamp: ISODate("2022-02-09T09:05:35.000Z"),
    prev_price: 0.29,
    percent_change: 0
  }
]


EDITED: Changed the query so that it does not throw an error when the previous document (the one from five minutes earlier) is not found. In that case, the previous price, date and percent change are printed as “none”. For example:

  {
    latest_time_stamp: ISODate("2022-02-09T09:10:49.000Z"),
    latest_price: 0.29,
    trading_pair: "CLV-GBP",
    prev_time_stamp: "none",
    prev_price: "none",
    percent_change: "none"
  }

Mind sharing the Golang driver version of the query, please? I should finally get used to the query once I am done with this request.

Also, will the empty previous date work now? I was getting "previous_time_stamp": "0001-01-01T00:00:00Z"

Thanks

I don't have access to a MongoDB server v5.x. I could only temporarily get access to run the shell query. It won't be possible for a while to write a Golang version of the query.

Yeah, I see, and there is no easy way to convert a shell query to a Golang query. Many shell commands take some work to convert to the Golang version.

I can DM you access to the server; it is just a temp server I stood up.

@Bradley_Benjamin, here is the aggregation query and the Golang version of it. You will notice that I have made some refinements to the query. The outputs are as I mentioned in the earlier comments.

The Aggregation query:

db.test.aggregate([
  { 
    $sort: { trading_pair: 1, time_stamp: -1 } 
  },
  { 
    $group: { 
        _id:  "$trading_pair",
        docs: { $push: "$$ROOT" }, 
        latest_time_stamp: { $first: "$time_stamp" }, 
        latest_price: { $first: "$price" }
    }
  },
  { 
    $addFields: {
        prev_doc: {
            $let: {
                 vars: {
                     filtered: {
                         $filter: {
                             input: "$docs", 
                             cond: { 
                                 $eq: [ 
                                     { $dateTrunc: { date: "$$this.time_stamp", unit: "second", binSize: 60 } }, 
                                     { $subtract: [                                    
                                         { $dateTrunc: { date: "$latest_time_stamp", unit: "second", binSize: 60 } }, 
                                         300000 
                                      ] } 
                                 ] 
                             }
                         }
                     }
                 },
                 in: {
                     $cond: [
                         { $eq: [ { $size: "$$filtered" }, 0 ] },
                         { time_stamp: "none", price: "none" },
                         { $arrayElemAt: [ "$$filtered", 0 ] }
                     ]
                 }
            }
        }
    }
  },
  {
     $project: {
         trading_pair: "$_id",
         _id: 0,
         latest_time_stamp: 1,
         latest_price: 1,
         prev_time_stamp: "$prev_doc.time_stamp",
         prev_price: "$prev_doc.price",
         percent_change: {
             $cond: [
                  { $eq: [ "$prev_doc.price", "none" ] },
                  "none",
                  {
                       $divide: [ 
                           { $multiply: [ 
                               { $subtract: [ "$latest_price", "$prev_doc.price" ] }, 
                               100 
                           ] }, 
                           "$prev_doc.price"
                        ]
                   }
              ]
         }
    }
  }
])

The Golang code:

period := 300000    // 5 * 60 * 1000 = 5 mins, in milliseconds

sortStage := bson.D{{"$sort", bson.D{{"trading_pair", 1}, {"time_stamp", -1}}}}

groupStage := bson.D{{"$group", bson.D{{"_id","$trading_pair"}, {"latest_time_stamp", bson.D{{"$first", "$time_stamp"}}}, {"latest_price", bson.D{{"$first", "$price"}}}, {"docs", bson.D{{"$push", "$$ROOT"}}}}}}

addFieldsStage := bson.D{{"$addFields", bson.D{{"prev_doc", bson.D{{"$let", bson.D{{"vars", bson.D{{"filtered", bson.D{{"$filter", bson.D{{"input", "$docs"}, {"cond", bson.D{{"$eq", bson.A{bson.D{{"$dateTrunc", bson.D{{"date", "$$this.time_stamp"}, {"unit", "second"}, {"binSize", 60}}}}, bson.D{{"$subtract", bson.A{bson.D{{"$dateTrunc", bson.D{{"date", "$latest_time_stamp"}, {"unit", "second"}, {"binSize", 60}}}}, period }}}}}}}}}}}}}, { "in", bson.D{{"$cond", bson.A{bson.D{{"$eq", bson.A{bson.D{{"$size", "$$filtered"}}, 0}}}, bson.D{{"time_stamp", "none"}, {"price", "none"}}, bson.D{{"$arrayElemAt", bson.A{"$$filtered", 0}}}}}}}}}}}}}}

projectStage := bson.D{{"$project", bson.D{{"trading_pair", "$_id"}, {"_id", 0}, {"latest_time_stamp", 1}, {"latest_price", 1}, {"prev_time_stamp", "$prev_doc.time_stamp"}, {"prev_price", "$prev_doc.price"}, {"percent_change", bson.D{{"$cond", bson.A{bson.D{{"$eq", bson.A{"$prev_doc.price", "none"}}}, "none", bson.D{{"$divide", bson.A{ bson.D{{"$multiply", bson.A{ bson.D{{"$subtract", bson.A{"$latest_price", "$prev_doc.price" }}}, 100 }}}, "$prev_doc.price" }}}}}}}}}}

crsr, err := collection.Aggregate(ctx, mongo.Pipeline{ sortStage, groupStage, addFieldsStage, projectStage })

if err != nil {
	fmt.Println("Failed to Aggregate:")
	log.Fatal(err)
}

for crsr.Next(context.TODO()) {
	var result bson.M
	if err := crsr.Decode(&result); err != nil {
		fmt.Println("Cursor decode error:")
		log.Fatal(err)
	}
	fmt.Println(result)
}

if err := crsr.Err(); err != nil {
	fmt.Println("Cursor error:")
	log.Fatal(err)
}

Just tested the Golang code, and I am still getting these errors:

2022/02/09 19:04:50 (BadValue) PlanExecutor error during aggregation :: caused by :: can't $divide by zero

Failed to Aggregate:

cursor.All() error: (BadValue) Executor error during getMore :: caused by :: can't $divide by zero

With the sample data you provided in the earlier post, I get the following outputs. In the first, there is data for both the latest and the previous (minus 5 mins) time stamps. The second doesn't have the previous data.

I could get the same outputs from both the shell and the Golang versions.

  {
    latest_time_stamp: ISODate("2022-02-09T09:10:49.000Z"),
    latest_price: 0.29,
    trading_pair: "CLV-GBP",
    prev_time_stamp: ISODate("2022-02-09T09:05:35.000Z"),
    prev_price: 0.29,
    percent_change: 0
  }

// The following without the previous data.

  {
    latest_time_stamp: ISODate("2022-02-09T09:10:49.000Z"),
    latest_price: 0.29,
    trading_pair: "CLV-GBP",
    prev_time_stamp: "none",
    prev_price: "none",
    percent_change: "none"
  }

Can you try cursor.All?
Also, you can run against the collection I mentioned to you. It is all test data; nothing important is on the MongoDB server.

This is what I have, and I am getting errors:

cursor.All() error: (BadValue) Executor error during getMore :: caused by :: can't $divide by zero
period := 300000
sortStage := bson.D{{"$sort", bson.D{{"trading_pair", 1}, {"time_stamp", -1}}}}
groupStage := bson.D{{"$group", bson.D{{"_id","$trading_pair"}, {"latest_time_stamp", bson.D{{"$first", "$time_stamp"}}}, {"latest_price", bson.D{{"$first", "$price"}}}, {"docs", bson.D{{"$push", "$$ROOT"}}}}}}
addFieldsStage := bson.D{{"$addFields", bson.D{{"prev_doc", bson.D{{"$let", bson.D{{"vars", bson.D{{"filtered", bson.D{{"$filter", bson.D{{"input", "$docs"}, {"cond", bson.D{{"$eq", bson.A{bson.D{{"$dateTrunc", bson.D{{"date", "$$this.time_stamp"}, {"unit", "second"}, {"binSize", 60}}}}, bson.D{{"$subtract", bson.A{bson.D{{"$dateTrunc", bson.D{{"date", "$latest_time_stamp"}, {"unit", "second"}, {"binSize", 60}}}}, period }}}}}}}}}}}}}, { "in", bson.D{{"$cond", bson.A{bson.D{{"$eq", bson.A{bson.D{{"$size", "$$filtered"}}, 0}}}, bson.D{{"time_stamp", "none"}, {"price", "none"}}, bson.D{{"$arrayElemAt", bson.A{"$$filtered", 0}}}}}}}}}}}}}}
projectStage := bson.D{{"$project", bson.D{{"trading_pair", "$_id"}, {"_id", 0}, {"latest_time_stamp", 1}, {"latest_price", 1}, {"prev_time_stamp", "$prev_doc.time_stamp"}, {"prev_price", "$prev_doc.price"}, {"percent_change", bson.D{{"$cond", bson.A{bson.D{{"$eq", bson.A{"$prev_doc.price", "none"}}}, "none", bson.D{{"$divide", bson.A{ bson.D{{"$multiply", bson.A{ bson.D{{"$subtract", bson.A{"$latest_price", "$prev_doc.price" }}}, 100 }}}, "$prev_doc.price" }}}}}}}}}}
cursor, err := collection.Aggregate(ctx, mongo.Pipeline{sortStage, groupStage, addFieldsStage, projectStage})

type PercentageChange struct {
	Id                primitive.ObjectID `json:"-" bson:"-"`
	LatestTimeStamp   time.Time `json:"latest_time_stamp" bson:"latest_time_stamp"`
	PrevTimeStamp     time.Time `json:"prev_time_stamp" bson:"prev_time_stamp"`
	TradingPair       string `json:"trading_pair" bson:"trading_pair"`
	LatestPrice	      float64 `json:"latest_price" bson:"latest_price"`
	PrevPrice	      float64 `json:"prev_price" bson:"prev_price"`
	PercentageChange  float64 `json:"percentage_change" bson:"percentage_change"`
}

var results []PercentageChange

if err != nil {
	fmt.Println("Failed to Aggregate:")
	log.Fatal(err)
}
if err = cursor.All(ctx, &results); err != nil {
	fmt.Println("cursor.All() error:", err)
}

return c.JSON(results)

Is there anything I am missing?

The output has six fields and the struct definition has seven (I see you have included the Id field). I recall from your earlier comment that you did not want the _id field in the output.

You probably have noted that some of the output field values can be of different types; for example, the prev_time_stamp, prev_price and percent_change values can be a number/date or a string (“none”). But your struct defines these fields with the types float64 and time.Time. This will not work due to the data type differences.

In general, as you work with an application, you will constantly come across new requirements, changes and challenges. These can come in various forms - changes in inputs, validation, queries, outputs, tools used, etc. - and they will happen throughout an application's life cycle, in spite of all the planning.

My suggestion is that you post a separate question for each of the issues you are facing with your application and try to get help, answers and suggestions.

percentage_change should be float64 since we are dividing float values. The _id is already omitted; I sent you my output, remember?

Here is the sample output I get with the struct:

[
    {
        "latest_time_stamp": "2022-02-09T09:17:12Z",
        "prev_time_stamp": "0001-01-01T00:00:00Z",
        "trading_pair": "CLV-GBP",
        "latest_price": 0.29,
        "prev_price": 0,
        "percentage_change": 0
    }
]

So I am not sure the issue is with the struct. I will remove the _id completely, but prev_time_stamp is in ISODate format, so there is no issue there either; as you can see, latest_time_stamp works fine. Also, it seems you are pasting what is in MongoDB, but I am posting what is displayed in the JSON output.

Is there anything I need to change in my struct to reflect what is expected?

Instead of "none", can we use 0.00, which is float64? When there is no price difference, the percentage should read as 0.00 (no change), not a string value.
