An easy, step-by-step tutorial on how to extract an Actual Time Series using the Artesian C# SDK.
Artesian gives you straightforward access to your data history, so you can perform analyses and produce plans in the way that is most compatible with your workflow.
Let’s see step-by-step how to proceed.
The goal
Extract the data of an Actual Time Series Market Data.
The reference data is fictitious and was created exclusively for this example. With Artesian, it is possible to write any data that can be represented as a Time Series, making it suitable for storing your own production data.
Import of Artesian libraries and configuration
The first thing to do in order to use all the features of Artesian is to download the Artesian SDK from NuGet.
Once it is installed and the necessary libraries have been imported (the using directives at the top of the snippet below), we can configure Artesian by entering the essential base address and API key.
To obtain these two essential pieces of information, refer to the tutorial “How to install Artesian C# SDK”.
We can optionally also set a custom Policy Configuration: this can be useful if you want a behaviour different from the default one for retries or for the number of parallel calls the SDK makes.
The default configuration is already the optimal one, so it is recommended not to change it.
After configuring Artesian and, optionally, a Policy, we can configure the Query Service and proceed with the extraction of the data.
using Artesian.SDK.Dto;
using Artesian.SDK.Service;
using NodaTime;

ArtesianServiceConfig cfg = new ArtesianServiceConfig(
    new Uri("https://arkive.artesian.cloud/{tenantName}/"), "{api-key}");

ArtesianPolicyConfig policy = new ArtesianPolicyConfig();
//policy
//    .RetryPolicyConfig(retryCount: 3, retryWaitTime: 200)
//    .CircuitBreakerPolicyConfig(maxExceptions: 2, durationOfBreak: 3)
//    .BulkheadPolicyConfig(maxParallelism: 10, maxQueuingActions: 15);

var qs = new QueryService(cfg, policy);
Creating the Actual extraction
Once we have configured Artesian and the Query Service, we can start thinking about which data we want to extract and how.
The basic piece of information Artesian needs is the ID, or a list of IDs, of the Market Data of interest, which can be obtained through the UI.
Once we have decided which IDs we are interested in extracting, we can evaluate how we want to extract them. The fundamental parameters to be determined are:
The Time Range of the extraction: Artesian offers various possibilities; for each of them, keep in mind that the end of the extraction range is always exclusive. For this specific example, let’s consider the AbsoluteDateRange (“2018-08-01”, “2018-08-10”).
The data extraction TimeZone: selected according to your needs; Artesian will take care of converting the data if necessary.
The Granularity of the data extraction: it can coincide with the original granularity of the data or differ from it, as long as an Aggregation Rule has been configured on the curve.
The Aggregation Rule: the Artesian feature that allows you to extract data at granularities different from the original one. The aggregation/disaggregation operation applied to the data is defined by setting this property. The possible options are “Undefined”, “SumAndDivide” or “AverageAndReplicate”. In the case of “Undefined”, it will not be possible to extract data at granularities different from the original one (see the sketch after this list for where this property is set on the curve).
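Since the Aggregation Rule is a property of the Market Data itself rather than of the query, the following is a minimal, non-authoritative sketch of how it could be set when registering a curve through the MarketDataService; the provider name, curve name, granularity and timezone are purely illustrative, and the exact registration helpers may vary slightly between SDK versions.
// Minimal sketch: setting the AggregationRule on a curve at registration time.
// Provider name, curve name, granularity and timezone are illustrative placeholders.
var marketDataService = new MarketDataService(cfg);

var entity = new MarketDataEntity.Input()
{
    ProviderName = "MyProvider",
    MarketDataName = "MyActualTimeSeries",
    OriginalGranularity = Granularity.Hour,
    OriginalTimezone = "CET",
    Type = MarketDataType.ActualTimeSerie,
    AggregationRule = AggregationRule.AverageAndReplicate // enables extraction at other granularities
};

var mktData = marketDataService.GetMarketDataReference(
    new MarketDataIdentifier(entity.ProviderName, entity.MarketDataName));

if (!await mktData.IsRegistered())
    await mktData.Register(entity);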
Once the extraction parameters have been chosen, the query is ready to be launched and the data obtained can be viewed.
var actualTimeSeries = await qs.CreateActual()
    .ForMarketData(new int[] { 100000492, 100000496 })
    .InGranularity(Granularity.Day)
    .InAbsoluteDateRange(new LocalDate(2018, 08, 01), new LocalDate(2018, 08, 10))
    .ExecuteAsync();

return actualTimeSeries.ToList();
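To quickly view the result, the returned rows can simply be enumerated and printed. The following is a minimal sketch that continues from the snippet above and assumes the rows of an Actual extraction expose TSID, Time and Value properties (add using System; for Console).
// Minimal sketch: printing the extracted rows (assumed properties: TSID, Time, Value).
foreach (var row in actualTimeSeries)
{
    Console.WriteLine($"{row.TSID} - {row.Time}: {row.Value}");
}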
Other options for data extraction
Regarding the selection of the extraction ranges, Artesian supports the following options:
“AbsoluteDateRange”: an absolute, fixed period of time (e.g., from “2018-08-01” to “2018-08-13” will allow you to extract data from “2018-08-01” to “2018-08-12”).
“RelativePeriod”: represents a relative period of time, before or after today (e.g., considering that today is “2021-03-31”, requesting the period “P-5D” will mean extracting the data from “2021-03-26” to “2021-03-30”, while requesting the period “P5D” will mean extracting the data from “2021-03-31” to “2021-04-04”). For the syntax, it is possible to refer to the ISO 8601 standard; in addition to the simple “RelativePeriod”, it is possible to use the “RelativePeriodRange” (e.g., from “P-5D” to “P5D” will extract the data from “2021-03-26” to “2021-04-04”).
“RelativeInterval”: a fixed-size “rolling” time span. The possible options are: “RollingWeek”, “RollingMonth”, “RollingQuarter” or “RollingYear”, i.e. the last 7, 30, 90 or 365 days of data (with the current day included); and “WeekToDate”, “MonthToDate”, “QuarterToDate” or “YearToDate”, i.e. from the beginning of the current week, month, quarter or year up to the current day.
.InAbsoluteDateRange(new LocalDate(2018,08,01), new LocalDate(2018,08,10))
.InRelativeInterval(RelativeInterval.RollingMonth)
.InRelativePeriod(Period.FromDays(5))
.InRelativePeriodRange(Period.FromWeeks(2), Period.FromDays(20))
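As a complete example, a hedged sketch of a rolling extraction in a specific timezone is shown below; the market data ID and the “CET” timezone are illustrative, and the InTimezone call is assumed to be available on the query builder.
// Minimal sketch: rolling extraction of the last month of daily data, converted to CET.
var rollingSeries = await qs.CreateActual()
    .ForMarketData(new int[] { 100000492 })
    .InGranularity(Granularity.Day)
    .InTimezone("CET")
    .InRelativeInterval(RelativeInterval.RollingMonth)
    .ExecuteAsync();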
In addition to the options previously mentioned, Artesian also offers the possibility of applying a filling strategy to manage any missing data. The possible options are:
FillNull(): the default behaviour, which also returns empty values (null) in the extraction.
FillNone(): an operation that does not return empty values (null) in the extraction.
FillLatestValue(“P5D”): an operation that returns, in place of missing values, the last value available within the period indicated in the call, in this case up to 5 days back.
FillCustomValue(): an operation that applies a custom value in the extraction in place of missing values (null). For curve types with multiple components (such as Market Assessments), the custom values can apply to the settlement, the opening and/or closing price, the highest and/or lowest price, the volume paid, the volume sold and/or the volume.
.WithFillNull()
.WithFillNone()
.WithFillLatestValue(Period.FromDays(7))
.WithFillCustomValue(123)
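For instance, a hedged sketch of a complete query that fills any missing days with the latest value available up to 7 days back could look like the following (the market data ID and the dates are illustrative):
// Minimal sketch: absolute date range extraction with the FillLatestValue strategy.
var filledSeries = await qs.CreateActual()
    .ForMarketData(new int[] { 100000492 })
    .InGranularity(Granularity.Day)
    .InAbsoluteDateRange(new LocalDate(2018, 08, 01), new LocalDate(2018, 08, 10))
    .WithFillLatestValue(Period.FromDays(7))
    .ExecuteAsync();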
Alternative to the SDK extraction
As an alternative to SDK extraction, we can extract data directly from the portal in Excel format.
The SDK approach, however, only needs to be set up once and is then entirely reproducible and can be automated within our workflow.
Not only does this save time, but it also minimizes the human errors caused by repeating operations on substantial amounts of data or across different Excel files.
This is an undeniable advantage that allows us to focus on analysing the data instead of managing it.