Uploading custom data to TrendSpider

You can upload your custom time series data to TrendSpider. This is extremely useful when you have data that we don't: results of your own computations, your trusted fundamental data, or the number of cars on Walmart parking lots per day. You name it.

To upload custom data:
- Click the Profile icon on the top right corner of the web application
- Select 'Upload Custom Data' and proceed to upload or update custom data as a symbol.

CustomData.png

Once you upload your data, you can use it wherever you can use conventional symbols like AAPL. This includes:

  1. Browsing on charts, either as the main symbol or as overlay data (via the Price Compare indicator).
  2. Composite symbols. For example, you can upload your own time series for the number of iPhones sold per month and then chart a composite like =IPHONESALES / AAPL, which will display the ratio between "phones sold" and the price of Apple shares.
  3. Strategies. On TrendSpider, a strategy can use multiple symbols at a time: one of them is the asset it trades, and all the others can serve as additional data sources for your strategy conditions. You can use your custom data as secondary symbols in the Strategy Tester. Read more in the "Using multiple symbols in one script" section of this article.
  4. All in all, TrendSpider is a vast platform. Long story short, wherever you can use a symbol like AAPL, you can use your custom symbol too: Smart Checklists, Seasonality, Dashboards and so on.

You can upload your custom data using CSV files, either via the web application interface, or via the API. Once you upload the data, we store it and you can use it anywhere on the platform.

CSV Data format

A CSV file you upload should have 2 or 3 columns. The order of columns matters; the order of rows does not. Here is what the columns should be:

  1. Symbol (optional). This is what you want to call your symbol within TrendSpider; for example, the symbol for Apple Inc. is AAPL. Custom symbols must always start with the # character and should not contain any other non-alphanumeric characters. The maximum length of a symbol is 25 characters.
  2. Date&Time (mandatory). This column contains the date (and, optionally, the time) of your data point. We support a broad range of formats (see the examples below). All time stamps are assumed to be in the America/New_York time zone. If a date&time record contains no time at all, we assume the time to be the "market session start" of that date.
  3. Value (mandatory). This should be a numeric value. Exponential notation is allowed.

Overall, our data uploader is very forgiving and flexible. It trims excessive spaces in all columns and goes an extra mile or two trying to parse everything, but in the end it simply ignores lines that cannot be parsed. Once the upload succeeds, you'll see a notification explaining how many rows were uploaded successfully and how many (and which) were ignored.

Every file you upload can contain as many symbols and data points as you need, as long as the file stays below the overall limit of 7 MB.

Supported date formats

Examples of allowed date formats: 2 Jun 2023, 09 Aug 2021, 09-01-2022, 2020-09-03, 11/03/2020. Time is supported as HH:mm. A date&time value must contain a date, optionally followed by a time stamp. Supported combined formats can be any combination of the above, ISO strings like 2023-03-06T22:00:04.647Z, Unix time stamps (the number of seconds since 1 Jan 1970) or JS time stamps (the number of milliseconds since 1 Jan 1970).
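
For instance, each of the following rows should parse (a hypothetical symbol, shown purely to illustrate the formats):

#EXAMPLE,2 Jun 2023,1.5
#EXAMPLE,2023-06-05 09:30,1.6
#EXAMPLE,2023-03-06T22:00:04.647Z,1.7
#EXAMPLE,1696291200,1.8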

Regardless of your format, milliseconds are always ignored (trimmed). If your date&time format is not listed here, simply give it a try: the odds are that it's supported as well. You can't specify a time zone in your time stamp; even if you follow ISO, we'll assume the date&time is "as of America/New_York".

CSV with 2 columns vs 3 columns

You can upload CSVs with either 3 columns or 2 columns.

  1. If you have 3 columns, they must be symbol, date&time and value. If a CSV header record is present, it is ignored.
  2. If you have 2 columns, they must be date&time and value. A CSV header record must be present in this case, and the second column's title becomes your symbol name.

The following 2 examples result in identical data sets:

#AMERIBOR,2024-01-27,5.4334
#AMERIBOR,2024-01-28,5.4334
#AMERIBOR,2024-01-29,5.43371

timestamp,#AMERIBOR
2024-01-27,5.4334
2024-01-28,5.4334
2024-01-29,5.43371

What happens when you upload data

When you upload the CSV, we normalize all the rows, filter out invalid rows and then put this data into your personal storage.

When uploading data, we overwrite older data points of the same symbols if your new file features matching symbols and time stamps. Other data points are simply appended to the storage. Uploading a CSV thereby incrementally updates your set of data. Assume you upload a CSV like this:

SYMBOL1, 1 Jan 2020, 11.5
SYMBOL2, 1 Jan 2020, -40
SYMBOL1, 1 Feb 2020, 11
SYMBOL2, 1 Feb 2020, -34

After this upload, you'll have 2 symbols (SYMBOL1, SYMBOL2), with data points as described: SYMBOL1 will be [(1 Jan 2020, 11.5), (1 Feb 2020, 11)] and SYMBOL2 will be [(1 Jan 2020, -40), (1 Feb 2020, -34)]. Assume you upload yet another CSV after that:

SYMBOL1, 1 Feb 2020, 10.5
SYMBOL1, 1 Mar 2020, 5
SYMBOL2, 1 Mar 2020, -20

After that, you'll still have 2 symbols (SYMBOL1, SYMBOL2). SYMBOL1 value for 1 Feb 2020 will be corrected and will now be 10.5, and other new values will be appended. Overall, you will now have SYMBOL1 like [(1 Jan 2020, 11.5), (1 Feb 2020, 10.5), (1 Mar 2020, 5)] and SYMBOL2 will be [(1 Jan 2020, -40), (1 Feb 2020, -34), (1 Mar 2020, -20)].

There is no way to purge existing data points by uploading a file. If you want some data points removed, you'll have to remove the entire custom symbol and upload all of its data again.

Applying market session data to your data points

Overall, we do not have any expectations about the granularity of your data. It can be 1 data point per month, or 1 per week, or 1 per day, or 1 at 09:11 and then the next one after 4.5123 days. Long story short, we assume that your data points are sparse and their distribution over time is not predictable.

In order for data like that to be useful, any time you request your custom data we sort it and apply 3 transformations:

  1. We "land" your data points onto the time stamps of candles of the requested time frame, within the corresponding market session (the one you selected when uploading the data).
  2. While landing, we apply the "aggregation" operation (again, selected by you when uploading) whenever more than 1 data point falls into the same candle.
  3. We extend (extrapolate) every value of yours to the right, as far as the next data point (or the "current moment").

If your data points fall into periods when the corresponding market is closed (weekend, holidays, pre-market, etc.), they are landed onto the first candle of the first market session after the time stamp they belong to ("aggregation" applies if that candle already has data points).
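
For example (a hypothetical illustration), assume you selected the US stock session and the "last" aggregation method, and you upload these rows:

#EXAMPLE,2024-01-06 10:00,2.5
#EXAMPLE,2024-01-07 18:00,3.5
#EXAMPLE,2024-01-10,4.0

January 6 and 7, 2024 fall on a weekend, so on a Daily chart both rows land on the candle of Monday, January 8; since 2 data points share 1 candle, "last" keeps 3.5. That value is then extended to the right over January 8 and 9, the value 4.0 takes over on January 10, and it keeps extending until your next data point (or the current moment).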

Using the API

You can use the API to upload or delete custom data on TrendSpider. However, you can't use the API to read this data back. Any time you access the API, you need to provide a secret API key. You can find this key in the Custom Symbols Manager dialog in the web application.

# your personal key from the Custom Symbols Manager dialog
export APIKEY="B1Usbr227Tr07s7Wzx5"
export HOSTNAME="https://charts.trendspider.com"

List symbols

curl "$HOSTNAME/userapi/1/data/custom_symbols/" -H "Authorization: Bearer $APIKEY"

Example response

["#CBYIELD"]

Delete a symbol

base64Encoded=$(echo -n "#CBYIELD" | base64);  # -n: do not encode a trailing newline
curl -X DELETE "$HOSTNAME/userapi/1/data/custom_symbols/base64:$base64Encoded" -H "Authorization: Bearer $APIKEY"

Example response

//  upon success
{
    "result": "ok",
    "rowsRemoved": 420
}

//  upon failure, i.e., no such symbol
{ "error": "nothing_found" }

Upload a symbol from a CSV file

Uploading is exotic, in the sense that it doesn't use a conventional "file upload" mechanic. We expect a JSON body where the CSV is passed as a base64-encoded string with no newlines. We don't store this content on a hard disk over the course of uploading. The body size limit is 7 MB. Uploading files is rate limited to a few calls per minute.

# stock, fx, futures, crypto
targetAssetType="stock";

# last, first, min, max, average, sum, ohlc
groupingMethod="last";

# file name in your file system
fileName="./AMERIBOR.csv";

# base64 should contain no newlines
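# (-w 0 is a GNU base64 flag; macOS/BSD base64 prints a single unwrapped line by default, so plain base64 should work there)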
fileContentBase64=$(cat "$fileName" | base64 -w 0);
contentPrefix="data:text/csv;base64";

payload="{\"fileBase64\":\"$contentPrefix,$fileContentBase64\",\"targetAssetType\":\"$targetAssetType\",\"groupingMethod\":\"$groupingMethod\"}";

curl -X POST "$HOSTNAME/userapi/1/data/custom_symbols/" -H "Authorization: Bearer $APIKEY" -H 'Content-type: application/json' -d "$payload";

Example response

{
    "result": "ok",
    "dataPointsUploaded": 1709,
    "dataPointsAccepted": 1694,
    "symbolsFound": 2,
    "rowsSkipped": [2,26,41,63,96,115,148,202,269,378,646,1107,1108,1323,1384]
}

How and where we store your data

Every customer has their own dedicated storage which holds only their data and nobody else's. Technically speaking, we store your data in a dedicated table in a cloud database, keeping backups for up to 30 days. We apply industry-standard protection (e.g., against SQL-based attacks) and other reasonable measures to protect your data. Your data can be accessed by a 3rd party only if your account gets compromised (e.g., somebody steals your password). Even in this case, the big question will be whether they can make any sense of your custom time series data.

Limitations

There are a number of known limitations and certain non-intuitive aspects to this feature.

  1. For each market session, its "end" moment (e.g., 16:00 for the US stock market) is not considered a part of the session. So all sessions are mathematically defined as [START, END).
  2. If you upload data for the 24/7 session (which is defined as [00:00, 23:59) UTC), then data points falling into the 60 seconds from 23:59 to 00:00 UTC are treated as belonging to the next session.
  3. Custom symbols do not work in watch lists: they are supposed to display no values and to appear crossed out.
  4. If you use custom data in our trading bots, your bots might be stopped if uploading a new set of data creates entry or exit signals that are already in the past (more than 1 candle ago).
  5. At the current stage, each Custom Symbols API key lives for 1 year and cannot be regenerated before that.
  6. Scanning on Custom Symbols is currently not supported.

Examples of data

Here's an example of a file you could upload:

#AMERIBOR,2024-01-27,5.4334
#AMERIBOR,2024-01-28,5.4334
#AMERIBOR,2024-01-29,5.43371
#AMERIBOR,2024-01-30,5.42456
#AMERIBOR,2024-01-31,5.43372
#AMERIBOR,2024-02-01,5.41858
#AMERIBOR,2024-02-02,5.42485

A few examples of publicly available data sources

  1. FRED, Federal Reserve Bank of St. Louis Data. 823K+ US and international time series from 114 sources, from the unemployment rate and bond yields to the length of railroads built in various random places.
  2. Google Trends, data illustrating the popularity of various search queries on Google. Remember that the CSV files it produces need some cleaning prior to uploading: namely, you've got to erase all the rows prior to the "header" row (see the example command after this list). Try correlating search trends for "Ford Bronco" with the price action on F!
  3. Check out the Dataverse index page. Use the map to navigate to a particular data repository; Harvard Dataverse (https://dataverse.harvard.edu/) is one of the biggest pools. Use the search box to find what you need. Note: not all the data in these repositories is time series data.
  4. Various US statewide weather history data at noaa.gov. This data requires transforming before uploading.
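
A quick sketch of the Google Trends cleanup (assuming the downloaded file is named multiTimeline.csv and its header row is the 3rd line; adjust the line number to match your file, or simply delete the extra rows in any text editor):

# keep everything from the header row onward (here: starting at line 3)
tail -n +3 multiTimeline.csv > cleaned.csv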

The Wikipedia page for "Datasets for machine-learning research" might be a good place to start from as well.

Apr 15, 2024
