Introduction to our APIs
Welcome to the Productsup API documentation space!
At this moment, we have the following API documentation available for our clients, including a how-to section with documents on different integration possibilities:
- Stream API - High-performance API for product upload and content
  - Product Data write
- Platform API - Account management API
  - Manage projects and sites
  - Get information about previous imports
  - Check errors per site run
  - Find information about exports and their history
  - Check the status of a site
  - Trigger a run for a site
  - Product Data read
Unified authentication
You need different tokens to authenticate with the Platform API and the Stream API. However, some endpoints support unified authentication to simplify API usage. For now, this applies to the site Stream data source management endpoints. See Site Stream Data Sources for more information.
In the future, we are planning to unify authentication for further endpoints. Your main contact at Productsup can help you get Stream API and Platform API access. You can also contact them to create new or additional tokens and revoke old ones.
Status Page
Our status page provides real-time information on system performance and security. You will find live and historical data on system performance. A note will be posted there if there are any interruptions in service.
You can subscribe on the page to updates and receive notifications via your preferred communication channel.
Sunset FAQ - Platform API data write endpoints
When do you plan to sunset the Platform API? We will sunset the Platform API data write endpoints used for product content upload by March 31, 2025.
Why do you plan to sunset the Platform API? The Stream API is our latest content upload API that provides enhanced speed and efficiency for data uploads compared to the Platform API upload endpoints. We want to phase out the use of the Platform API data write endpoints to provide our customers with a better content upload experience and reduce API maintenance efforts.
What is the Stream API? The Stream API is a high-performance data upload API designed to handle a high throughput or a high number of items with product changes. See Stream API for more information on the capabilities of this API and Introduction to Streams to create your first Stream and upload your products to Productsup.
What do I need to do to migrate to the Stream API? You need to implement the Stream API content upload endpoints and phase out the respective Platform API endpoints. See Migrate content upload from Platform API to Stream API for a detailed step-by-step guide to carry out a smooth transition. The guide includes different migration scenarios for your convenience. If you have any questions or need assistance during the transition, you can contact support@productsup.com.
Can I get a token for the Stream API? Yes, you can get a Stream API token for your account. You can request the token by reaching out to your main point of contact at Productsup or contacting support@productsup.com. See Stream API Authentication for more information.
How do I retrieve data from the Stream API? The Stream API does not support reading product data. You can still read product data via the Platform API data read endpoints.
What do I need to know about the Stream API performance? The Stream API supports two Stream types. A suitable Stream type depends on your exact scenario. In general, we recommend using chunked streams as they benefit most from the performance boost of the Stream API. You can test different Stream types in a site within your sandbox project to better understand the performance impact. See Which is the best Stream type for your use case? for a more comprehensive guide on choosing a suitable Stream type.
Do you need anything from me to help with the migration process? If you reach out to us for help in the migration process, we would like to understand your specific needs to better support you. You can consider the answers to these three default questions and provide us with the related details when reaching out:
- How many products in total do you have in your product catalog?
- How many attributes does your product catalog have on average?
- How often do you make changes in your product catalog? Hourly or daily?
How-to guides
In this section, you can find different guides for integrating with our APIs.
Introduction to Streams
In this guide, you can learn about our Stream API and get the guidelines for the basic integration.
Prerequisites
- Read Authentication.
- Get a Personal Access Token.
- Get familiar with the terminology in our Glossary.
- Create a site for this tutorial in the Productsup platform.
- Have an HTTP client at hand. Our tutorial provides cURL commands you can execute in a terminal or import into Postman.
Get started
Each set of instructions matches with a request and response on the right.
Step 1: Create your first stream
Request 1: Create your first stream
curl --location --request POST 'https://stream-api.productsup.com/streams' \
--header 'Authorization: Bearer <token>' \
--header 'Content-Type: application/vnd.api+json' \
--header 'Accept: application/vnd.api+json' \
--data-raw '{
"data": {
"type": "stream",
"attributes": {
"name": "Getting started tutorial My first stream",
"type": "chunked"
}
}
}'
Response 1: Create your first stream
{
"data": {
"type": "stream",
"id": "<stream_id>",
"attributes": {
"name": "Getting started tutorial My first stream",
"type": "chunked",
"createdAt": "2022-11-23T18:07:45+01:00",
"updatedAt": "2022-11-23T18:07:45+01:00"
},
"relationships": {
"account": {
"data": {
"type": "account",
"id": "<account_id>"
}
}
}
}
}
In the first request on the right, you can see how to create a Stream via our Stream API:
1. If needed, change the value of the name attribute.
2. Leave the type attribute set to chunked for the tutorial. The chunked type is the default type. See Stream types for more information.
In the response, you can see actual values instead of the placeholders <stream_id> and <account_id>.
Step 2: List your Streams
Request 2: List your Streams
curl --location --request GET 'https://stream-api.productsup.com/streams' \
--header 'Authorization: Bearer <token>' \
--header 'Accept: application/vnd.api+json'
Response 2: List your streams
{
"data": {
"type": "stream",
"id": "<stream_id>",
"attributes": {
"name": "Getting started tutorial My first stream",
"type": "chunked",
"createdAt": "2022-11-23T18:07:45+01:00",
"updatedAt": "2022-11-23T18:07:45+01:00"
},
"relationships": {
"account": {
"data": {
"type": "account",
"id": "<account_id>"
}
}
}
}
}
After creating a Stream in Step 1, you can see which Streams are available in your account. You can list all Streams using the example request on the right. If you already have some Streams in your account, the response also lists them.
See Listing Streams for more information about pagination or listing individual Streams.
Step 3: Upload data to your Stream
Request 3: Upload your products in NDJSON format
curl --location --request POST 'https://stream-api.productsup.com/streams/<stream_id>/products' \
--header 'Authorization: Bearer <token>' \
--header 'Content-Type: application/x-ndjson' \
--header 'Accept: application/vnd.api+json' \
--data-binary @- <<EOF
{"id":"34SKJDF42DF","name":"Product 1","company":"ACME Corp.","price":100}
{"id":"475-SHIRT-XL","name":"Product 2","company":"ACME Corp.","price":15", "size":"XL"}
{"id":7824796324,"name":"Product 3","company":"ACME Corp.","price":5,"sale_price":4.5, "size":""}
EOF
Response 3: Upload your products in NDJSON format
{
"data": {
"type": "batch",
"id": "<batch_id>",
"attributes": {
"status": "uploaded",
"errorCount": 1,
"stages": {
"upload": {
"completedAt": "2022-11-25T15:40:10+01:00",
"status": "warning",
"successCount": 2,
"errorCount": 1,
"errors": [
{
"message": "Syntax error",
"occurrences": 1,
"example": {
"lineNumber": 2,
"value": "{\"id\":\"475-SHIRT-XL\",\"name\":\"Product 2\",\"company\":\"ACME Corp.\",\"price\":15\", \"size\":\"XL\"}"
}
}
]
},
"processing": null
}
}
},
"relationships": {
"stream": {
"data": {
"type": "stream",
"id": "<stream_id>"
}
}
}
}
In this step, upload three products to your newly created Stream. For the chunked Stream, the product data should use the NDJSON format. Copy the request as in the example on the right. There is a deliberate syntax error in the request body; it is part of the tutorial.
Replace the <stream_id> placeholder in the request URL with the Stream ID created in Step 1.
In the response, you can see the details about the status of your upload. The returned id attribute is what we refer to as a Batch ID. It's a unique reference to your upload. Save the Batch ID in your notes to get more information about the upload later.
This initial response directly reports any validation errors. With the help of the status, successCount, and errorCount attributes, you can see if the upload is successful. Since we parse each product individually, partially successful uploads can occur. In Step 6, you can find a detailed explanation.
See Uploading data and API standards for more information.
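Hand-writing NDJSON lines makes quoting mistakes, like the deliberate one above, easy to introduce. The following sketch (Python, for illustration only; the products are taken from the tutorial payload) serializes each product with a JSON library so that every line is guaranteed to be valid:

```python
import json

# Products from the tutorial payload. Serializing with json.dumps makes
# the quoting error shown in the tutorial request impossible.
products = [
    {"id": "34SKJDF42DF", "name": "Product 1", "company": "ACME Corp.", "price": 100},
    {"id": "475-SHIRT-XL", "name": "Product 2", "company": "ACME Corp.", "price": 15, "size": "XL"},
    {"id": 7824796324, "name": "Product 3", "company": "ACME Corp.", "price": 5, "sale_price": 4.5, "size": ""},
]

# NDJSON: one JSON object per line.
ndjson_body = "\n".join(json.dumps(p) for p in products) + "\n"
print(ndjson_body)
```

You can then send ndjson_body as the request body with the Content-Type header set to application/x-ndjson.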
Step 4: Create a Stream API Data Source
Request 4: Create a Stream API Data Source
curl --location --request POST 'https://platform-api.productsup.io/platform/v2/sites/<site_id>/streams' \
--header 'Authorization: Bearer <token>' \
--header 'Content-Type: application/json' \
--data-raw '{
"import_type": 1,
"description": "Stream API Tutorial - First Stream API Data Source",
"stream_id": <stream_id>,
"status": "active"
}'
Response 4: Create a Stream API Data Source
{
"success": true,
"Sources": [
{
"id": <data_source_id>,
"site_id": <site_id>,
"description": "Stream API Tutorial - First Stream API Data Source",
"source": "",
"import_type": 1,
"import_id": 331,
"status": "active",
"settings": [
"stream : <stream_id>"
]
}
]
}
If you didn't create a site in the platform or don't have access to an existing site, you can skip Steps 4 and 5. They require some interaction with the site.
In this step, you need to create a Stream API data source. With the help of the data source, the platform can import product data from a Stream into a site.
In the request body, you can change the description attribute. The platform uses the description when listing data sources. You must submit a value in the stream_id attribute to link the site to the Stream.
See Stream API Data Sources for more information.
Step 5: Import your first products
Request 5: Import your first products
curl -X POST 'https://platform-api.productsup.io/platform/v2/process/<site_id>' \
--header "Authorization: Bearer <token>" \
--header "Content-Type: application/json" \
--data-raw '{
"action": "import"
}'
Response 5: Import your first products
{
"success": true,
"process_id": "<process_id>",
"estimated_duration": 26.105,
"message": "A new process has been triggered successfully!"
}
After configuring the data source, you can trigger an import process via the Platform API to import the Stream API data into your site. This API call fulfils the same purpose as the Import button in the platform.
Ensure you replace the <site_id> placeholder with the Site ID you are using in this tutorial.
See Processing data for more information.
Step 6: Check the status of your upload
Request 6: Check the status of your upload
curl --location --request GET 'https://stream-api.productsup.com/streams/<stream_id>/batches/<batch_id>' \
--header 'Authorization: Bearer <token>' \
--header 'Accept: application/vnd.api+json'
Response 6: Check the status of your upload
{
"data": {
"type": "batch",
"id": "<batch_id>",
"attributes": {
"status": "processed",
"errorCount": 1,
"stages": {
"upload": {
"completedAt": "2022-11-25T15:40:10+01:00",
"status": "warning",
"successCount": 2,
"errorCount": 1,
"errors": [
{
"message": "Syntax error",
"occurrences": 1,
"example": {
"lineNumber": 2,
"value": "{\"id\":\"475-SHIRT-XL\",\"name\":\"Product 2\",\"company\":\"ACME Corp.\",\"price\":15\", \"size\":\"XL\"}"
}
}
]
},
"processing": {
"completedAt": "2022-11-25T15:50:45+01:00",
"status": "success",
"successCount": 2,
"errorCount": 0,
"errors": []
}
}
},
"relationships": {
"stream": {
"data": {
"type": "stream",
"id": "<stream_id>"
}
}
}
}
}
In this step, you can check your product upload status. The batch status endpoint lets you learn more about your Batch.
Replace the <stream_id> and <batch_id> parameters with the correct values.
In the response, you can see processed in the status attribute of the batch, indicating the completion of the batch import to the platform. Previously, in Step 3, you could see the status uploaded that indicated the successful upload of the batch that precedes the import.
A general errorCount attribute lists the total number of errors that occurred during both stages. Currently, we have two stages, upload and processing, which contain more information about the events during each stage. Each stage contains the status attribute indicating the general status of all events during the stage. The successCount and errorCount attributes contain the numbers of successfully and unsuccessfully imported products. The errors attribute shows details about the failed products and the reasons for the failure. Only successfully imported products continue to the next stage.
See Batches for more information.
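To programmatically decide whether a batch needs attention, you can read the per-stage counters from the batch response. A sketch (Python, illustrative only; the response structure is abridged from the example above, with the errors arrays omitted):

```python
import json

# Example batch response, abridged from the tutorial (placeholders kept).
response = json.loads("""
{
  "data": {
    "type": "batch",
    "id": "<batch_id>",
    "attributes": {
      "status": "processed",
      "errorCount": 1,
      "stages": {
        "upload": {"status": "warning", "successCount": 2, "errorCount": 1, "errors": []},
        "processing": {"status": "success", "successCount": 2, "errorCount": 0, "errors": []}
      }
    }
  }
}
""")

attrs = response["data"]["attributes"]
stages = attrs["stages"]

# The top-level errorCount is the total across all stages.
total_errors = sum(stage["errorCount"] for stage in stages.values() if stage)
assert total_errors == attrs["errorCount"]

print(attrs["status"], total_errors)  # prints: processed 1
```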
Migrate content upload from Platform API to Stream API
Released in 2022, the Stream API is the newest content upload API for Productsup. It focuses on performance and scalability and replaces the Platform API content upload endpoints. All new customers use the Stream API for content upload. All customers using the Platform API for these purposes should migrate to the Stream API because Productsup plans to sunset the content upload endpoints of the Platform API by the end of March 2025. You can keep using the Platform API as the account management API for your integration with Productsup.
The Stream API provides multiple advantages over the content upload functionality of the Platform API:
- You can use newline-delimited JSON (NDJSON) (also see Wikipedia) instead of the classic JSON.
- No requirements to commit uploaded data.
- You can upload and import your data right away.
- No need to generate a Batch ID per upload request.
- No differentiation between full and delta logic. The Stream API accepts all product data you send regardless of whether it is a new product or an update of an existing product. If the received product IDs already exist, the Stream API updates the corresponding products with the new payload. If not, it creates new products.
- You can send concurrent requests via both the Stream API and the Platform API.
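The "no full versus delta" behavior described above can be pictured as a keyed upsert. A sketch (Python, illustrative only; the SKU names are made up) of how the Stream API treats incoming product IDs:

```python
# Existing products in a Stream, keyed by product ID.
stream = {
    "SKU-1": {"id": "SKU-1", "price": 100},
}

incoming = [
    {"id": "SKU-1", "price": 90},   # existing ID -> product is updated
    {"id": "SKU-2", "price": 15},   # new ID -> product is created
]

for product in incoming:
    # The Stream API accepts everything you send: existing IDs are
    # overwritten with the new payload, unknown IDs become new products.
    stream[product["id"]] = product

print(sorted(stream))  # prints: ['SKU-1', 'SKU-2']
```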
This section guides you through the requirements for a successful migration from the Platform API to the Stream API. To migrate to the Stream API, you need to take these steps:
- Prepare and test the Stream API integration.
- Choose a migration scenario for your Productsup site.
- Perform the migration in a live site or in a copy of a live site.
- (Optional) If you need to revert to your pre-migration setup, roll back to the Platform API.
Prepare the Stream API integration
First, you need to set up your integration with the Stream API by creating a Stream. See the section Getting started in Basic introduction to Streams to create your first Stream and upload your products.
Additional preparation for the copied live site migration scenario
If you choose to migrate using a copy of your live site, you need to use your sandbox project for copying the live site. Check if you have a sandbox project in your Productsup account. See Use your sandbox project for testing purposes for more information on the sandbox project best practices.
If you don't have a sandbox project in your account, reach out to your Customer Success Manager to have it added to the list of your projects.
Choose a migration scenario
You can perform the migration by setting up your Stream API integration in one of these locations:
- In a copy of your live site.
- In the live site itself.
Both scenarios have their advantages and disadvantages.
If you choose to migrate using a copy of your live site: | If you choose to migrate using your live site: |
---|---|
You reduce the risk of accidentally sending incorrect test data to your export channels. | You risk accidentally sending incorrect test data to your export channels. |
You can take the time to verify and test your import process in the copied site without pausing your live site. This means your live site can keep sending data to your export channels using the pre-migration setup until you test all the changes in the copied site and implement them in your live site. | You need to pause your live site and stop sending data to the export channels, while you set up and test the new Stream API integration. |
You need to make all the migration-related changes twice. First, in the copied site, and then, in the live site. | You need to make all the migration-related changes only once. |
If the migration process goes wrong, both scenarios let you roll back to your pre-migration setup.
Use a copy of a live site
To set up your Stream API integration in a copy of a live site, do the following:
Create one Stream for your live data and another one for your sandbox data. See the section Getting started in Basic introduction to Streams.
Log in to your Productsup account and go to the sandbox project to add a new site. See Copy a site for instructions.
To prevent the copied site from sending test data to your pre-configured destinations, opt out of the settings Copy with Export A/B Destinations and Copy with Scheduling & Triggering. If you use these options, you must ensure your copied site doesn't export any data to the copied destinations and doesn't run according to the copied schedules and triggers. To do so, immediately pause the copied site or deactivate its destinations, schedules, and triggers. See Disable or pause a site and Activate or deactivate an export for instructions. See Set up scheduling and triggering processes in Scheduling for deleting schedules and triggers.
When you successfully copy a site, the platform takes you to the Data Sources page in the newly created site. If it does, proceed to the next step.
- If the platform doesn’t take you to the newly created site, an error must have occurred. You can find out more in the displayed error message.
- If the platform takes you to the Dashboard or any other location in the newly created site, go to Data Sources from the site's main menu.
Select ADD DATA SOURCE to add and set up the Productsup Stream API data source. Choose your sandbox stream in Stream. See Import data via the Productsup Stream API for instructions.
Once the platform saves your new data source and goes back to the Data Sources page, ensure that the Productsup Stream API data source is active by checking if the play icon (▶︎) is highlighted in the State column.
Deactivate or delete the Productsup Platform API data source:
- Deactivate it by selecting the pause icon (⏸) in the State column.
- See Remove a data source to delete it.
Select Import in the top-right corner of the page to import your data. Once the site finishes the import, go to Data View or Dataflow to check the imported data in your sandbox site. If the imported data doesn't meet your expectations, review your Stream API integration, check that all requests are correct, and make sure you send all the required data to Productsup.
Once the imported data meets your expectations, go to Dashboard in your live site and select Paused in the status drop-down menu next to the Import button.
Copy all the migration-related settings from your sandbox site into your live site.
Go to Data Sources and select the pause icon (⏸) next to the Productsup Platform API data source.
When your live site has all the needed migration-related changes and the imported data meets your expectations, go to Dashboard and select Active in the status drop-down menu next to the Import button.
Now, you can send your data to the export channels:
- If your site has schedules or triggers, you can let the platform start the first export automatically at a pre-configured time.
- If your site doesn't have any schedules or triggers, select Run or Export in the top-right corner of the page.
Use the live site itself
To set up your Stream API integration in a live production site, do the following:
Create a Stream for your live data. See the section Getting started in Basic introduction to Streams.
Log in to your Productsup account, go to the needed project, and open the relevant live site.
Set the site status to Paused using the state drop-down menu in the top-right corner of the page next to the Import button. When you pause a site, it can still import data, but it can't export it, which ensures you don't send any test data to your export channels.
Go to Data Sources from the site's main menu and select ADD DATA SOURCE to add and set up the Productsup Stream API data source. See Import data via the Productsup Stream API for instructions.
Once the platform saves your new data source and goes back to the Data Sources page, disable the Productsup Stream API data source by selecting the pause icon (⏸) in the State column. Your site now has Productsup Platform API and Productsup Stream API data sources. Using the State column in Data Sources, you can control which API should import data into your site.
Run your first export with data imported via the Stream API
Before running the first export with the data imported via the Stream API, you need to check that the platform imported your product data as expected. Take the following steps to get your data sources ready for the first export to your export channels:
- Pause the Productsup Platform API data source.
- Enable the Productsup Stream API data source.
- Select Import in the top-right corner of the page.
- Validate if the imported data is correct in Data View or Dataflow. While validating the data imported using the Productsup Stream API data source, you should set the site status to Paused.
- When the imported data meets your expectations, go to Dashboard and select Active in the status drop-down menu next to the Import button. The platform can't export your data while the site is paused.
- Now, you can send your data to the export channels:
- If your site has schedules or triggers, you can let the platform start the first export automatically at a pre-configured time.
- If your site doesn't have any schedules or triggers, select Run or Export in the top-right corner of the page.
Roll back to your previous setup
If something goes wrong and the data you import using the Productsup Stream API data source doesn't meet your expectations, follow these steps to roll back to your previous Productsup Platform API setup:
- Pause the Productsup Stream API data source.
- Enable the Productsup Platform API data source.
- Select Import in the top-right corner of the page.
- Validate if the imported data is correct in Data View or Dataflow.
- Contact support@productsup.com to fix the issue with your data import and finish the migration process.
Stream API
Introduction
The Stream API is the latest addition to the APIs Productsup offers to its clients, and it follows the JSON API standard. Streams support a high throughput or a high number of items that undergo product changes. Streams give clients more flexibility in how they use their data in the Productsup platform.
Glossary
Term | Description |
---|---|
Stream | An account-bound space where you can send items via the Stream API. See STREAMS for more information. |
Batch | A collection of information about a group of products that you send to Productsup. |
Site | A location in the Productsup Platform where you import data from a Stream via Data Source. |
Product | Generic term to indicate data that is sent to the Stream API. The Productsup API does not require you to send only product information. It lets you send different types of data, as long as it follows some minimal requirements. |
Data Source | A connection between a Stream and a site. It lets clients select where to import data from. You can set this up in the Productsup platform. |
Productsup Platform | The web-based platform where clients manage their projects, sites, and data sources. |
Personal Access Token | Means of authentication against our API. Also referred to as PAT. |
Authentication
Your main contact at Productsup can help you get Stream API access. You can also contact them to create new or additional tokens and revoke old ones.
Example: An authenticated request that lists a specific Stream
curl --location --request GET 'https://stream-api.productsup.com/streams/124773' \
--header 'Authorization: Bearer <token>'
The Stream API authentication layer is based on the concept of a Personal Access Token (PAT). The Productsup platform links PATs to user accounts. In the future, PATs will let you finely control user permissions. However, there are currently no specific authorization levels other than full access.
Each request you make to the Stream API needs authentication. To authenticate, you must send an Authorization header with a bearer token. The following is an example of the header format:
Authorization: Bearer <token>
API Standards
The Stream API follows the JSON API standard. This implies that all requests and responses follow the defined structure and use the content-type application/vnd.api+json. However, Productsup does not accept this content-type in request bodies for product uploads, for the following reasons:
- The application/vnd.api+json content-type has too much overhead for product data.
- The application/x-ndjson content-type offers better performance when extracting single products from a request.
- The application/json content-type is our legacy standard, as Productsup does not want to force existing clients to change their entire integration.
This does not imply that Productsup will never support sending data in the application/vnd.api+json format. However, we need to understand the value of it first.
Compression support
Productsup supports gzip compression for product uploads to improve performance further. Use the Content-Encoding header to specify gzip compression for request payloads. Gzip compression can significantly reduce the payload size, leading to shorter upload times and better performance.
See Content-Encoding examples 6 and 7 for how to use gzip compression in a product upload request with the two Stream types.
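The compression step can be sketched as follows (Python, for illustration; the token is a placeholder, and sending the compressed bytes would additionally require an HTTP client, which is not shown):

```python
import gzip

ndjson_body = (
    '{"id":"34SKJDF42DF","name":"Product 1","price":100}\n'
    '{"id":"475-SHIRT-XL","name":"Product 2","price":15}\n'
)

# Compress the request payload before sending it.
compressed = gzip.compress(ndjson_body.encode("utf-8"))

# Headers for the upload request; Content-Encoding tells the API
# that the payload is gzip-compressed.
headers = {
    "Authorization": "Bearer <token>",
    "Content-Type": "application/x-ndjson",
    "Content-Encoding": "gzip",
}

# Round trip to verify the payload survives compression intact.
assert gzip.decompress(compressed).decode("utf-8") == ndjson_body
print(len(ndjson_body.encode("utf-8")), "->", len(compressed), "bytes")
```

For realistic catalogs with many similar rows, the compressed payload is substantially smaller, which shortens upload times.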
Streams
This section introduces a concept called Streams. Streams are spaces where we can receive ongoing flows of structured data. Using Streams gives our clients the following possibilities:
- Importing and merging data into a single site from different systems using multiple Streams.
- Importing data from a single Stream into multiple sites in order to reuse the data without uploading it multiple times.
- Allowing the clients to create and merge Streams by themselves.
Stream creation
You can create a maximum of 100 Streams per account. If you need more Streams, reach out to your Customer Success Manager.
Stream naming
Stream names should be unique, and their length can be from 3 to 255 characters.
Stream deletion
If you delete a Stream via API, it remains in our system for 7 days. During this period, you can no longer access the data of the deleted Stream, but its name remains reserved. When 7 days have passed, the system completely deletes the Stream, and its name becomes available for new Streams.
Stream formats
Streams support different structured data types. For now, you can use the following formats for the input, but we may add more formats in the future:
- Classic JSON
- Newline-delimited JSON (NDJSON)
Stream types: Chunked & Referenced
The Stream API supports two Stream types to handle data:
- Chunked
- Referenced
Chunked type
The chunked type provides high throughput and upload speeds.
The maximum recommended request size is 10,000 products. If the request size exceeds 1 MB, you should split it into chunks of up to 1 MB. Our solution automatically rejects any individual product exceeding 1 MB.
The chunked type supports the NDJSON format, where the structure lets us keep the object deserialization simple and leverage it to process more data.
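The size guidance above can be enforced client-side before uploading. A sketch (Python, illustrative only; the 10,000-product and 1 MB figures come from this section, and the helper name is made up):

```python
import json

MAX_PRODUCTS_PER_REQUEST = 10_000   # recommended maximum request size
MAX_PRODUCT_BYTES = 1_000_000       # products above ~1 MB are rejected

def chunk_products(products):
    """Yield lists of products sized for one chunked-Stream request,
    skipping any single product that exceeds the per-product limit."""
    chunk = []
    for product in products:
        line = json.dumps(product)
        if len(line.encode("utf-8")) > MAX_PRODUCT_BYTES:
            continue  # would be rejected by the API anyway
        chunk.append(product)
        if len(chunk) == MAX_PRODUCTS_PER_REQUEST:
            yield chunk
            chunk = []
    if chunk:
        yield chunk

# 25,000 small products -> 3 requests (10,000 + 10,000 + 5,000).
catalog = [{"id": i, "price": 1} for i in range(25_000)]
chunks = list(chunk_products(catalog))
print([len(c) for c in chunks])  # prints: [10000, 10000, 5000]
```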
Referenced type
The referenced type suits clients who send extremely large quantities of data.
The recommended request size is from 10 to 200 MB. Your request can't be smaller than 10 MB.
For now, the referenced type supports the classic JSON format only.
Which is the best type for your use case?
Answer the following questions to determine which Stream type is best for you.
Do you plan to send single product updates? Single product updates mean the system will send requests for individual products.
If yes, use the chunked type.
Do you plan to send frequent product updates? Frequent product updates mean that the system pushes multiple updates per minute: single-product or batched updates.
If yes, use the chunked type.
Do you work with delta-supporting destinations? Some destinations, such as GMC or Facebook Marketplaces, support delta updates.
If yes, use the chunked type. As we continuously improve our systems, our latest end-to-end delta functionality lets you achieve a near-real-time pipeline and a much smoother integration.
Do you have a large catalog of over 10 million products and send delta updates?
If yes, use the chunked type. Our system imports delta changes faster from the chunked Streams.
Are you going to send only full catalog updates?
If yes, continue with the following questions to get a more precise recommendation:
Is your catalog size over 10 million products?
If yes, use the referenced type.
Do you import your data no more than once or twice a day?
If yes, use the referenced type.
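The decision questions above can be condensed into a small helper. A sketch (Python, illustrative only; the rules mirror the questions in this section, and the function name and defaulting to chunked for unmatched cases are assumptions):

```python
def recommend_stream_type(
    single_updates: bool,
    frequent_updates: bool,
    delta_destinations: bool,
    catalog_over_10m: bool,
    sends_deltas: bool,
    full_catalog_only: bool,
    imports_at_most_twice_daily: bool,
) -> str:
    """Mirror the decision questions from the documentation."""
    if single_updates or frequent_updates or delta_destinations:
        return "chunked"
    if catalog_over_10m and sends_deltas:
        return "chunked"
    if full_catalog_only and (catalog_over_10m or imports_at_most_twice_daily):
        return "referenced"
    return "chunked"  # assumption: fall back to the default type

# Full catalog of over 10M products, imported once a day -> referenced.
print(recommend_stream_type(False, False, False, True, False, True, True))  # prints: referenced
```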
Stream Management
The Stream Management endpoints bring independence and flexibility to our clients. They let clients integrate Stream management into their workflow.
Stream creation
Example: Create a Stream in your account
curl --location --request POST 'https://stream-api.productsup.com/streams' \
--header 'Authorization: Bearer <token>' \
--header 'Content-Type: application/vnd.api+json' \
--header 'Accept: application/vnd.api+json' \
--data-raw '{
"data": {
"type": "stream",
"attributes": {
"name": "My product stream",
"type": "chunked"
}
}
}'
Example: Create a Stream in another account you have access to
curl --location --request POST 'https://stream-api.productsup.com/streams' \
--header 'Authorization: Bearer <token>' \
--header 'Content-Type: application/vnd.api+json' \
--header 'Accept: application/vnd.api+json' \
--data-raw '{
"data": {
"type": "stream",
"attributes": {
"name": "My product stream",
"type": "chunked"
},
"relationships": {
"account": {
"data": {
"type": "account",
"id": "<account_id>"
}
}
}
}
}'
When you create a Stream, two attributes are required:
- name - an arbitrary value clients can use to identify their Stream. This value is also visible in the UI when selecting a Stream in a Data Source.
- type - the Stream type, chunked or referenced. See Stream types: Chunked & Referenced for more information.
Additionally, the relationships object is optional and only relevant if you have access to multiple accounts. With this object, you can control the account in which the Stream is created.
List Stream
Example: List all Streams
curl --location --request GET 'https://stream-api.productsup.com/streams' \
--header 'Authorization: Bearer <token>' \
--header 'Accept: application/vnd.api+json'
Example: List Streams next page
curl --location --request GET 'https://stream-api.productsup.com/streams?page[offset]=10&page[limit]=10' \
--header 'Authorization: Bearer <token>' \
--header 'Accept: application/vnd.api+json'
Example: List a specific Stream
curl --location --request GET 'https://stream-api.productsup.com/streams/124773' \
--header 'Authorization: Bearer <token>' \
--header 'Accept: application/vnd.api+json'
You can either list all streams the user has access to or a specific, individual stream.
The list of all streams is paginated to a maximum of 10 streams per page. To traverse the paginated list, you can use the pagination links found in the top-level links object of the response.
We support the following query parameters for traversing:
- page[offset] - determines the first record to return
- page[limit] - determines the number of records per page (minimum 1, maximum 10)
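As a sketch of how traversal could look, the following loop builds the URLs for the first three pages by advancing page[offset] by page[limit]. The three-page count is purely illustrative; a real client would follow the pagination links from the response instead of a fixed count.

```shell
# Build the first three page URLs; page[offset] advances by page[limit].
BASE='https://stream-api.productsup.com/streams'
LIMIT=10
for PAGE in 0 1 2; do
  OFFSET=$((PAGE * LIMIT))
  URL="${BASE}?page[offset]=${OFFSET}&page[limit]=${LIMIT}"
  echo "$URL"
  # Each URL can then be requested with:
  # curl --location --request GET "$URL" --header 'Accept: application/vnd.api+json'
done
```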
Update Stream
Example: Update a specific Stream
curl --location --request PATCH 'https://stream-api.productsup.com/streams/124773' \
--header 'Content-Type: application/vnd.api+json' \
--header 'Accept: application/vnd.api+json' \
--data-raw '{
"data": {
"id": "124773",
"type": "stream",
"attributes": {
"name": "My product Stream with an updated name"
}
}
}
'
Because of technical limitations, it's not possible to change the type of a Stream. If you require a different type, we advise you to create a new Stream with the correct type and remove the old Stream.
Remove Stream
Example: Remove a Stream
curl --location --request DELETE 'https://stream-api.productsup.com/streams/124773'
Remove Streams to clean up streams that are no longer needed, or when you need to switch a Stream's type. Deleting a stream deletes all data it contains. If you removed a Stream to switch between stream types, you must push the full catalog to the new Stream.
When you remove a Stream, all data inside the Stream is unrecoverable after deletion.
Site Stream Data Sources
The data source management endpoint allows you to link your streams to sites programmatically. You can create Stream API data sources and link them to sites, update them, delete them, or get more information about existing links.
The authentication for this endpoint is unified, i.e. the new Stream API Personal Access Token authentication can be used to access functionality from the Platform API.
For more information, see the Platform API Site Stream Data Sources documentation.
Uploading Data
See the chapter on Stream Type for more information on the differences between both types and also payload recommendations.
There is a single endpoint to upload data to Productsup:
/streams/{streamId}/products
Request body
Example 1: Chunked Stream - NDJSON with one product
# Stream API only has cURL examples
curl --location --request POST 'https://stream-api.productsup.com/streams/124773/products' \
--header 'Content-Type: application/x-ndjson' \
--header 'Accept: application/vnd.api+json' \
--data-raw '{"id":"34SKJDF42DF","name":"Product 1","company":"My Company"}'
Example 2: Chunked Stream - NDJSON with multiple products
curl --location --request POST 'https://stream-api.productsup.com/streams/124773/products' \
--header 'Content-Type: application/x-ndjson' \
--header 'Accept: application/vnd.api+json' \
--data-binary @- <<EOF
{"id":"34SKJDF42DF","name":"Product 1","company":"MyCompany"}
{"id":"475-SHIRT-XL","name":"Product 2","company":"My Company","size":"XL"}
{"id":7824796324,"name":"Product 3","company":"My Company","price":5,"sale_price":4.5}
EOF
Example 3: Referenced Stream - NDJSON with multiple products
curl --location --request POST 'https://stream-api.productsup.com/streams/224773/products' \
--header 'Content-Type: application/x-ndjson' \
--header 'Accept: application/vnd.api+json' \
--data-binary @- <<EOF
{"id":"34SKJDF42DF","name":"Product 1","company":"MyCompany"}
{"id":"475-SHIRT-XL","name":"Product 2","company":"My Company","size":"XL"}
{"id":7824796324,"name":"Product 3","company":"My Company","price":5,"sale_price":4.5}
EOF
Example 4: Referenced Stream - JSON with one product
curl --location --request POST 'https://stream-api.productsup.com/streams/224773/products' \
--header 'Content-Type: application/json' \
--header 'Accept: application/vnd.api+json' \
--data-raw '[{"id":"34SKJDF42DF","name":"Product1","company":"My Company"}]'
Example 5: Referenced Stream - JSON with 3 products
curl --location --request POST 'https://stream-api.productsup.com/streams/224773/products' \
--header 'Content-Type: application/json' \
--header 'Accept: application/vnd.api+json' \
--data-binary @- <<EOF
[
{"id":"34SKJDF42DF","name":"Product 1","company":"MyCompany"},
{"id":"475-SHIRT-XL","name":"Product 2","company":"My Company","size":"XL"},
{"id":7824796324,"name":"Product 3","company":"My Company","price":5,"sale_price":4.5}
]
EOF
Example 6: Referenced Stream - JSON with compressed products
curl --location --request POST 'https://stream-api.productsup.com/streams/224773/products' \
--header 'Content-Type: application/json' \
--header 'Accept: application/vnd.api+json' \
--header 'Authorization: Bearer pup_token' \
--header 'Content-Encoding: gzip' \
--data-binary '@/path/to/compressed_file'
Example 7: Chunked Stream - NDJSON with compressed products
curl --location --request POST 'https://stream-api.productsup.com/streams/224773/products' \
--header 'Content-Type: application/x-ndjson' \
--header 'Accept: application/vnd.api+json' \
--header 'Authorization: Bearer pup_token' \
--header 'Content-Encoding: gzip' \
--data-binary '@/path/to/compressed_file'
The type of the Stream dictates the allowed payload format:
- Chunked Streams accept application/x-ndjson
- Referenced Streams accept application/json or application/x-ndjson
Accompany every request with the correct Content-Type header. The platform does not support the mixing of different content types in a single Stream.
Productsup supports gzip compression for both types of Streams to improve performance further. Use the Content-Encoding header to specify gzip compression for request payloads.
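As a sketch of preparing a compressed upload (the file names products.ndjson and products.ndjson.gz are illustrative choices, not required by the API):

```shell
# Write a small NDJSON payload and compress it with gzip before upload.
printf '%s\n' \
  '{"id":"34SKJDF42DF","name":"Product 1","company":"My Company"}' \
  '{"id":"475-SHIRT-XL","name":"Product 2","company":"My Company","size":"XL"}' \
  > products.ndjson
gzip -cf products.ndjson > products.ndjson.gz

# The compressed file can then be sent with the Content-Encoding header:
# curl --location --request POST 'https://stream-api.productsup.com/streams/124773/products' \
#   --header 'Content-Type: application/x-ndjson' \
#   --header 'Content-Encoding: gzip' \
#   --header 'Accept: application/vnd.api+json' \
#   --data-binary '@products.ndjson.gz'
```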
Response depending on stream status
The stream status feature is essential for maintaining and protecting our system's performance. It enables administrators and automated processes to monitor and manage streams responsibly and to prevent potential system overload.
Status | Description |
---|---|
active | Default value |
disabled | Stream is disabled and rejects uploads. The API returns a 403 Forbidden status, and the platform doesn't trigger imports. |
locked | Stream is temporarily locked. The API returns a 423 Locked status, and the platform prohibits data upload and import. |
locked_upload | Stream is temporarily locked for uploading. The API returns a 423 Locked status. |
locked_processing | Stream is temporarily locked for importing. |
Product attribute requirements
To accommodate our clients' data sets, we don't enforce a specific standard set of attributes or naming conventions.
We recommend the following rules for naming and handling attributes:
- Ideally, attributes are lowercase and do not contain any spaces or special characters.
- Nested JSON structures are not supported; we recommend either:
- Flattening the object into multiple attributes
- Merging all its values with a separator character
- When you add new attributes in future requests:
- New attributes horizontally expand existing data with empty values
- When attributes are no longer part of an upload:
- If any existing row has a value for that attribute, the attribute remains
- Any new row uploaded without the attribute will have an empty value for it
ID-attribute tips
We always require that an id-attribute is present.
This is our golden rule. The id-attribute is a unique identifier for your specific data row, and the platform uses the id-attribute to create new rows, apply permutations, and delete specific rows.
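Because a missing id invalidates the row, it can pay off to check a payload locally before uploading. This sketch uses a plain grep over NDJSON lines; the file name and textual check are illustrative, not an official validator:

```shell
# Write a sample payload where one line is missing the required "id" attribute.
printf '%s\n' \
  '{"id":"34SKJDF42DF","name":"Product 1"}' \
  '{"name":"Product 2 without an id"}' \
  > payload.ndjson

# Count lines without an "id" key (a simple textual check, not full JSON validation).
MISSING=$(grep -cv '"id"' payload.ndjson)
echo "lines missing id: ${MISSING}"
```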
Process data
The process endpoint allows you to control when your data flows in and out of the Platform. It supports triggering jobs that import and export your data, in the following combinations:
- Import: just import the data from all Data Sources into a Site
- Export: just export the data for one or all configured Channel(s)
- Combined: run an import and consecutively export for all Channels
It's important to keep the following topics in mind:
- The Process endpoint works with Sites, not Streams
- Only a single Process can run per site; we allow you to queue up to 1 additional Process
- If a Process is already queued, triggering another request to the Process endpoint will result in us rejecting the request with a 429 status code
Additional information for Stream API users:
- A site should have a data source to import from your Stream.
See Platform API Process Data documentation for more information.
Enhance Stream API processing speed
Product updates
Large updates involving 5 million or more products take significant time to process. So, we advise updating your product data more frequently and on smaller scales. Frequent smaller updates are more manageable and lead to faster processing.
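One way to turn a large catalog into smaller, more frequent uploads is to split the NDJSON file and send each chunk as its own request. The chunk size and file names below are illustrative assumptions; pick a size that suits your update cadence:

```shell
# Generate a demo catalog of 25 NDJSON rows (real catalogs are far larger).
: > catalog.ndjson
i=1
while [ "$i" -le 25 ]; do
  echo "{\"id\":\"SKU-${i}\",\"name\":\"Product ${i}\"}" >> catalog.ndjson
  i=$((i + 1))
done

# Split into chunks of 10 lines each: chunk_aa, chunk_ab, chunk_ac
split -l 10 catalog.ndjson chunk_

# Each chunk can then be uploaded as its own request:
# for f in chunk_*; do
#   curl --location --request POST 'https://stream-api.productsup.com/streams/124773/products' \
#     --header 'Content-Type: application/x-ndjson' \
#     --header 'Accept: application/vnd.api+json' \
#     --data-binary "@${f}"
# done
```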
Full site runs
Full site runs that involve importing, processing, and exporting data require the Productsup platform to process all product updates in one go, which leads to prolonged processing times. Besides breaking down your product updates into smaller, more frequent increments, we advise using partial site runs that involve either importing or exporting data to maintain smoother operations and reduce processing times.
Continuous exports
Implementing continuous exports in the Productsup platform means that the system is constantly processing a part of product updates. We recommend running imports in between your exports to let the system handle your product updates incrementally, which ensures your product data is up-to-date and significantly speeds up your subsequent imports.
Batches
Once you send product data to the API, it responds with a status and a batch identifier or Batch ID. Use the Batch ID to check the processing status of a specific payload.
Note: Batch IDs are stream-specific. The following example uses a chunked stream:
# Stream API only has cURL examples
curl --location --request GET 'https://stream-api.productsup.com/streams/124773/batches/b15e8bb1-bd53-470a-9597-785003536978' \
--header 'Accept: application/vnd.api+json'
We keep track of 3 overall statuses of a batch:
- uploaded - products were fully or partially uploaded to a stream
- processed - products were fully or partially processed and imported to a site
- failed - all products from a batch were invalid during either the upload or processing stage
Each batch transits through 2 stages, upload and processing, and each stage can have one of 3 statuses:
- success - no invalid products were encountered
- warning - at least 1 invalid product was encountered
- failure - all products were invalid
Partial uploads
When you upload a batch which contains some invalid products, a chunked stream accepts it as a partial upload and responds with a 202 Accepted HTTP status code. The property data.attributes.stages.upload of the batch object then includes the status of the upload stage, the number of successfully uploaded products, details of how many invalid products were encountered, and a list of unique error messages captured with example raw product data.
A referenced stream does not validate products on upload and will always indicate that an upload was either a success or failure.
Processing
When you run an import process, the property data.attributes.stages.processing of the batch object gets updated with the status of the processing stage, which also includes the number of successfully imported products, the number of errors encountered, and a list of unique error messages captured with example raw product data.
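A simple client-side check could fetch the batch and extract its overall status. The sample body below only mimics a plausible batch response (the data.attributes.status shape is an assumption inferred from the fields described above; consult a real response for the exact structure), and the extraction uses plain sed where a JSON parser like jq would be more robust:

```shell
# A sample of what a batch response body might contain (shape is an assumption).
RESPONSE='{"data":{"id":"b15e8bb1-bd53-470a-9597-785003536978","type":"batch","attributes":{"status":"processed"}}}'

# In a real script the response would come from the batches endpoint:
# RESPONSE=$(curl --location --request GET \
#   'https://stream-api.productsup.com/streams/124773/batches/b15e8bb1-bd53-470a-9597-785003536978' \
#   --header 'Accept: application/vnd.api+json')

# Extract the overall status.
STATUS=$(printf '%s' "$RESPONSE" | sed -n 's/.*"status":"\([^"]*\)".*/\1/p')
echo "batch status: ${STATUS}"
```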
Invalid batch
Chunked type
Error | Response Code | Description |
---|---|---|
Empty Product Attributes | 422 | When sending a product where all attributes are empty |
Empty Payload line | 422 | When sending an empty line |
Missing "id" attribute | 422 | When the id attribute is not found in the product details |
Product payload is too large | 422 | When the payload size is too big, max line length is 2097152 bytes |
Control character error, possibly incorrectly encoded | 422 | When there is an encoding error in the data, such as a missing opening quote |
Syntax error | 422 | When a comma between attributes is missing, for example |
Empty Payload | 422 | When the payload is empty |
Example: Empty Product Attributes of Chunked type
curl --location --request POST 'https://stream-api.productsup.com/streams/streamId/products' \
--header 'Content-Type: application/x-ndjson' \
--header 'Authorization: Bearer token' \
--data-raw '{}'
Example: Empty Payload line of Chunked type
curl --location --request POST 'https://stream-api.productsup.com/streams/streamId/products' \
--header 'Content-Type: application/x-ndjson' \
--header 'Authorization: Bearer token' \
--data-raw ' '
Example: Empty Payload of Chunked type
curl --location --request POST 'https://stream-api.productsup.com/streams/streamId/products' \
--header 'Content-Type: application/x-ndjson' \
--header 'Authorization: Bearer token' \
--data-raw ''
Referenced type
Error | Response Code | Description |
---|---|---|
Empty Payload | 422 | When the payload is empty |
Example 1: Empty Payload of Referenced type
curl --location --request POST 'https://stream-api.productsup.com/streams/streamId/products' \
--header 'Content-Type: application/json' \
--header 'Accept: application/vnd.api+json' \
--header 'Authorization: Bearer token' \
--data-raw '[{ }]'
Example 2: Empty Payload of Referenced type
curl --location --request POST 'https://stream-api.productsup.com/streams/streamId/products' \
--header 'Content-Type: application/json' \
--header 'Accept: application/vnd.api+json' \
--header 'Authorization: Bearer token' \
--data-raw '[]'
Example 3: Empty Payload of Referenced type
curl --location --request POST 'https://stream-api.productsup.com/streams/streamId/products' \
--header 'Content-Type: application/json' \
--header 'Accept: application/vnd.api+json' \
--header 'Authorization: Bearer token' \
--data-raw '[ ]'
Example 4: Empty Payload of Referenced type
curl --location --request POST 'https://stream-api.productsup.com/streams/streamId/products' \
--header 'Content-Type: application/json' \
--header 'Accept: application/vnd.api+json' \
--header 'Authorization: Bearer token' \
--data-raw ''
Delete products
We have two different endpoints that let clients delete product data:
- /streams/{streamId}/products/ - supports the deletion of one, several, or all products via the request body
- /streams/{streamId}/products/{productId}/ - supports the deletion of a single product via the URL
Delete products via the request body
Example: Delete a single product
# Stream API only has cURL examples
curl --location --request DELETE 'https://stream-api.productsup.com/streams/124773/products' \
--header 'Content-Type: application/x-ndjson' \
--header 'Accept: application/vnd.api+json' \
--data-raw '{"id":"34SKJDF42DF"}'
Example: Delete multiple products
curl --location --request DELETE 'https://stream-api.productsup.com/streams/124773/products' \
--header 'Content-Type: application/x-ndjson' \
--header 'Accept: application/vnd.api+json' \
--data-binary @- <<EOF
{"id":"34SKJDF42DF"}
{"id":"475-SHIRT-XL"}
{"id":7824796324}
EOF
Example: Delete all products
curl --location --request DELETE 'https://stream-api.productsup.com/streams/124773/products?all=true' \
--header 'Accept: application/vnd.api+json'
You can delete one or several products by:
- Sending a request to the endpoint /streams/{streamId}/products/
- Sending a request body with a list of id-attributes for the products that you want to remove
In case you want to delete all products, add the query parameter and value all=true to the URL. In this case, you can omit the request body.
Delete a product via the URL
Example: Delete a single product
curl --location --request DELETE 'https://stream-api.productsup.com/streams/124773/products/475-SHIRT-XL' \
--header 'Accept: application/vnd.api+json'
Delete single products by making a request to /streams/{streamId}/products/{productId}/.
Import setup
You don't need any special platform knowledge to set up an import with the Stream API. If you need guidance, take the following steps.
Prerequisites
To set up the import with Stream API, you need to have:
- User access to the Productsup platform
- A site in the Productsup platform ready for data import
- The Stream ID and Stream name(s).
Set up an import
- Open your site in the platform and go to Data Sources in the main menu.
- Select Add Data Source.
- In the search field, enter Stream API and find Stream API Data Source.
- Select Add to add the data source.
- Give your data source a custom name, as desired, to identify it among multiple data sources.
- Select Next to go to the Data Source Configuration page.
- Select the Stream you want to import from. You can use the label Stream to find it later.
- Change the description, as needed.
- Select Save to return to the Data Sources page.
- If your Stream already contains data, select Import in the upper-right corner to import the data.
- View your imported data in Data View.
Empty data sources
By default, the platform doesn't perform the import if a data source is empty, as it leads to a failure. But certain API scenarios may need to import an empty data source. For example, you may need to purge all data. To import an empty data source, take the following steps:
- Open your site in the platform and go to Data Sources in the main menu.
- Edit the Stream API data source.
- Go to the tab Advanced Settings.
- Go to Options.
- Check the checkbox Allow this datasource to have no products.
- Select Save Options.
To deactivate the import of an empty data source, repeat the steps above, but in step 5, uncheck the checkbox Allow this datasource to have no products.
Platform recommendations
Rate limiting
Different endpoints of the Stream API are subject to different rate limits. Rate limits are calculated based on the client's IP address and URL path of the accessed endpoint.
As visible in the following table, the products and batches endpoints are rate limited by the client's IP and by the unique stream ID from the URL, meaning that each stream gets rate limited independently.
Endpoint | Description | Limit |
---|---|---|
/streams/{streamId}/products | Products endpoint for pushing data | 30 requests per second per {streamId} |
/streams/{streamId}/batches | Batches endpoint for reading the processing status of a batch | 100 requests per second per {streamId} |
/* | All other endpoints (the limit applies to each endpoint individually) | 5 requests per second |
Rate limit HTTP headers
The response headers of all HTTP requests sent to the Stream API show your current rate limits:
$ curl -I https://stream-api.productsup.com
HTTP/2 200
ratelimit-limit: 5
ratelimit-observed: 1
ratelimit-remaining: 4
ratelimit-reset: 1651174223
Name | Description |
---|---|
ratelimit-limit | The maximum number of requests allowed per second. |
ratelimit-observed | The number of requests made in the current rate limit window. |
ratelimit-remaining | The number of requests remaining in the current rate limit window. |
ratelimit-reset | The time at which the current rate limit window resets, in Unix time. |
When you exceed the rate limit, an error response returns:
HTTP/2 429
ratelimit-limit: 5
ratelimit-observed: 5
ratelimit-remaining: 0
ratelimit-reset: 1651174223
{"errors":[{"status":"429","title":"Too Many Requests"}]}
Platform API
Introduction
The Platform API is a REST API that provides programmatic access to account management features such as managing projects and sites, listing site errors, triggering processes, and more.
For PHP users, we provide an API Client; see its Readme.md for more information. We accept non-breaking changes submitted as pull requests.
Status Page
Our status page provides real-time information on system performance and security. You will find live and historical data on system performance. A note will be posted there if there are any interruptions in service.
You can subscribe on the page to updates and receive notifications via your preferred communication channel.
Authentication
Make sure to replace 1234 and simsalabim with the client ID and secret you were provided.
# With shell, you can just pass the correct header with each request
curl -H "X-Auth-Token: 1234:simsalabim" https://platform-api.productsup.io/
<?php
$Client = new Productsup\Client();
$Client->id = 1234;
$Client->secret = 'simsalabim';
All requests to our API require valid authorization. You can build the authentication token from the client ID and client secret, which our team issues.
The token has the format client_id:client_secret and must be sent as the value of the X-Auth-Token HTTP header.
The client_id and client_secret are account-specific, so you can only access projects, sites, and other resources which lie under your account.
Version
The API uses a URL-based versioning mechanism. You can switch to the latest version by replacing v1 in the examples with v2.
If you are using the latest version Platform API Client, then you are automatically on the latest stable version.
Cursor Pagination
The API uses a cursor pagination mechanism. It supports three parameters to control pagination; pagination is only supported for GET requests.
HTTP Request
Example of meta-attribute in the response that contains the pagination parameters
# result:
{
"success": true,
"Entities": [
{
}
],
"meta": {
"cursor": {
"current": 0,
"prev": 0,
"next": 1,
"count": 1
}
}
}
GET https://platform-api.productsup.io/platform/v2/....?limit=number1&previous=number2&cursor=number3
Cursor parameters
Name | Type | Default | Description |
---|---|---|---|
limit | integer | 50 | Maximum number of entities per page |
cursor | integer | 0 | Returns entities whose IDs come after the cursor value (numeric value) |
previous | integer | 0 | The previous entity ID; 0 if there is none (numeric value) |
Response fields
Field | Type | Description |
---|---|---|
meta | array | Contains cursor details |
Cursor fields
Field | Type | Description |
---|---|---|
current | integer | The current cursor ID |
prev | integer | The previous entity ID |
next | integer | The ID of the first entity on the next page |
count | integer | Number of entities returned |
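To walk the cursor, a client reads meta.cursor.next from each response and passes it as the cursor parameter of the following request. The sample body below is trimmed for illustration, and the sed extraction is a sketch where a JSON parser like jq would be more robust:

```shell
# A sample response body with pagination metadata (entity list trimmed).
RESPONSE='{"success":true,"Entities":[],"meta":{"cursor":{"current":0,"prev":0,"next":1,"count":1}}}'

# Read meta.cursor.next and build the URL for the following page.
NEXT=$(printf '%s' "$RESPONSE" | sed -n 's/.*"next":\([0-9]*\).*/\1/p')
NEXT_URL="https://platform-api.productsup.io/platform/v2/projects?limit=50&cursor=${NEXT}"
echo "$NEXT_URL"
# curl -H "X-Auth-Token: 1234:simsalabim" "$NEXT_URL"
```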
Projects
Projects are used to group your Sites.
Get
Lists all projects of your account, or a single project.
For both requests, the response looks identical, except that requesting a specific project lists only that one project.
Requesting a list of all your projects
<?php
$projectService = new Productsup\Service\Projects($client);
$projects = $projectService->get();
print_r($projects);
/*
result will look like this:
Array
(
[0] => Productsup\Platform\Project Object
(
[id] => 1
[name] => default project
[created_at] => 2013-03-21 12:47:57
)
...
)
*/
curl https://platform-api.productsup.io/platform/v2/projects
{
"success": true,
"Projects": [
{
"id": "1",
"name": "default project",
"created_at": "2013-03-21 12:47:57",
"links": [...]
},
...
]
}
Requesting a single project
curl https://platform-api.productsup.io/platform/v2/projects/1
{
"success": true,
"Projects": [{
"id": "1",
"name": "default project",
"created_at": "2013-03-21 12:47:57",
"links": [...]
}]
}
<?php
$projectId = 1;
$projectService = new Productsup\Service\Projects($client);
$project = $projectService->get($projectId);
print_r($project);
/*
result will look like this:
Array
(
[0] => Productsup\Platform\Project Object
(
[id] => 1
[name] => default project
[created_at] => 2013-03-21 12:47:57
)
)
*/
Get all projects for your account
GET https://platform-api.productsup.io/platform/v2/projects
Get a project by its identifier
GET https://platform-api.productsup.io/platform/v2/projects/<projectId>
URL parameters
Field | Type | Description |
---|---|---|
projectId | integer | Project to list |
Response fields
Field | Type | Description |
---|---|---|
success | boolean | Indicates request status |
Projects | array | List of projects |
Project fields
Field | Type | Description |
---|---|---|
id | integer | Internal ID |
name | string | Name of the project |
created_at | date | Date of creation |
links | array | List of relevant resources |
Link fields and values
Name | Description |
---|---|
self | Link to the project detail endpoint |
sites | Link to a list of sites belonging to the project |
Create
To create a new project, you can use a POST request (or the insert method).
curl -d '{"name":"test project"}' \
https://platform-api.productsup.io/platform/v2/projects
# result:
{
"success": true,
"Projects": [{
"id": 125,
"name": "test project",
"created_at": "2015-07-30 12:54:52",
"links": [...]
}]
}
<?php
$projectService = new Productsup\Service\Projects($Client);
$project = new Productsup\Platform\Project();
$project->name = "test project";
$projectService->insert($project);
print_r($result);
/**
result:
Productsup\Platform\Project Object
(
[id] => 125
[name] => test project
[created_at] => 2015-07-30 12:54:52
)
*/
Create project
POST https://platform-api.productsup.io/platform/v2/projects
HTTP headers
Name | Value |
---|---|
Content-Type | application/json |
The data to be inserted has to be provided as a JSON object.
Request body fields
Field | Type | Description |
---|---|---|
name | string | Name of the project |
id and created_at have to be empty; otherwise, the values get overwritten, or the request may result in an error.
Response fields
Field | Type | Description |
---|---|---|
success | boolean | Indicates request status |
Projects | array | Details of the created project |
Project fields
See project fields
Links fields and values
See link fields
Edit
To edit an existing project, you can use a PUT request.
curl -d '{"name":"example project"}' \
https://platform-api.productsup.io/platform/v2/projects/125
# result:
{
"success": true,
"Projects": [{
"id": 125,
"name": "example project",
"created_at": "2015-07-30 12:54:52",
"links": [...]
}]
}
Update project
PUT https://platform-api.productsup.io/platform/v2/projects/<projectId>
URL parameters
Field | Type | Description |
---|---|---|
projectId | integer | Existing project that's being edited. |
HTTP headers
Name | Value |
---|---|
Content-Type | application/json |
The data to be inserted has to be provided as a JSON object.
Request body fields
Field | Type | Description |
---|---|---|
id | integer | Id of the existing project. |
name | string | Name of the project |
Response fields
Field | Type | Description |
---|---|---|
success | boolean | Indicates request status |
Projects | array | Details of the changed project |
Project fields
See project fields
Links fields and values
See link fields
Delete
Delete project
DELETE https://platform-api.productsup.io/platform/v2/projects/<projectId>
URL parameters
Field | Type | Description |
---|---|---|
projectId | integer | Project to delete |
Response body fields
Field | Type | Description |
---|---|---|
success | boolean | Indicates the success of the action |
curl -X DELETE https://platform-api.productsup.io/platform/v2/projects/125
# response:
{"success":true}
<?php
$projectService = new Productsup\Service\Projects($client);
$result = $projectService->delete(125); // id fetched from API
// result is true, if the delete was successful
Sites
Sites are the smallest entity, below projects. Each site has one data source and may have several exports/channels.
Get
To list all sites of your account, or only certain sites, you can use get.
Requesting a list of all your sites
<?php
// our php client builds the urls for you, but you have to set the infos to the classes:
$siteService = new \Productsup\Service\Sites($client);
// sending the actual request
$list = $siteService->get();
print_r($list);
/*
result will look like this:
Array
(
[0] => Productsup\Platform\Site Object
(
[id] => 123
[title] => site 1
[created_at] => 2015-01-01 11:22:33
[project_id] => 321
[links:protected] => Array(...)
[reference:protected] =>
)
...
*/
curl https://platform-api.productsup.io/platform/v2/sites
Requesting a list of all your sites within one project
curl https://platform-api.productsup.io/platform/v2/projects/321/sites
Requesting sites by id or tag
curl https://platform-api.productsup.io/platform/v2/sites/tagname:tagValue
curl https://platform-api.productsup.io/platform/v2/sites/123
# response:
{
"success": true,
"Sites": [
{
"id": "123",
"title": "site 1",
"created_at": "2015-01-01 11:22:33",
"project_id": "321",
"links": [...]
},
...
]
}
// or requesting it by reference/a tag:
$reference = new \Productsup\Platform\Site\Reference();
$reference->setKey('tagname');
$reference->setValue('123abc');
$list = $siteService->setReference($reference);
// or requesting the site by its id:
$list = $siteService->get(123);
/*
result will look like this:
Array
(
[0] => Productsup\Platform\Site Object
(
[id] => 123
[title] => site 1
[created_at] => 2015-01-01 11:22:33
[project_id] => 321
[links:protected] => Array(...)
[reference:protected] =>
)
...
*/
Get all sites for your account
GET https://platform-api.productsup.io/platform/v2/sites
Get all sites for a specific project
GET https://platform-api.productsup.io/platform/v2/projects/<projectId>/sites
URL parameters
Field | Type | Description |
---|---|---|
projectId | integer | Project to list sites for |
Get a site by its tag
GET https://platform-api.productsup.io/platform/v2/sites/<tagName>:<tagValue>
URL parameters
Field | Type | Description |
---|---|---|
tagName | string | Name of the tag for the site |
tagValue | string | Value of the tag for the site |
For more information, see references, also known as site tags.
Get a site by its identifier
GET https://platform-api.productsup.io/platform/v2/sites/<siteId>
URL parameters
Field | Type | Description |
---|---|---|
siteId | integer | Site to list |
Response fields
Field | Type | Description |
---|---|---|
success | boolean | Indicates request status |
Sites | array | List of sites |
Site fields
Field | Type | Description |
---|---|---|
id | string | Site identifier |
title | string | Name of the site |
status | string | Status of the site. See Site status information for valid values |
project_id | string | Identifier of the project this site belongs to |
import_schedule | string | A cron entry that sets the scheduling for data import. |
id_column | string | Name of the column that is considered as an identifier |
processing_status | string | Status of the site's latest job (Running/Done) |
created_at | date | Date of creation |
links | array | List of relevant resources |
Links fields and values
Name | Description |
---|---|
self | Link to current site detail |
tags | Link to a list of tags belonging to the site |
project | Link to project |
Site status information
Value for status | Description |
---|---|
active | The site is fully operational; data can be pushed via the API and the site will import and export |
paused_upload | The site can receive data via the API and import the data; however, it will not export data |
disabled | The site will block any data sent via the API; neither imports nor exports can be done |
Cron entry
Our scheduling format consists of an optional timezone and one or more scheduling formats.
The timezone, if needed, should be on first line of the format, if no timezone is given our
default timezone will be used. All PHP timezones are supported.
One or more schedules can be set up per site. A newline character should be used as
a separator between timezone and schedule and between each schedule.
In case any scheduling needs to be remove and empty value can be supplied.
Example formats with explanation
TZ=Europe/Berlin # This is the timezone
H 2,6,19,22 * * 2,4,6 # Run at 02:XX, 06:XX, 19:XX and 22:XX on every Tuesday, Thursday and Saturday
H * * * * # Run at a random minute, every hour, every day
1 3,8,21 */2 * * # Run at 03:01, 08:01 and 21:01 every second day
Value formatted for submission via API:
TZ=Europe/Berlin\nH 2,6,19,22 * * 2,4,6\nH * * * *\n1 3,8,21 */2 * *
The value H
for the minute position indicates that a random minute will be used. We recommend
configuring all schedules with this value: if many jobs start at exactly the same
minute, that can lead to bottlenecks in our system.
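Putting these rules together, the import_schedule value is a single string: an optional TZ= line followed by one or more cron lines, joined by newline characters. A minimal sketch in Python (the helper name is illustrative, not part of any SDK):

```python
def build_import_schedule(schedules, timezone=None):
    """Join an optional TZ= line and one or more cron lines with newlines.

    Passing an empty list yields an empty string, which removes any
    existing scheduling when submitted via the API.
    """
    if not schedules:
        return ""
    lines = []
    if timezone:
        lines.append(f"TZ={timezone}")  # the timezone must be the first line
    lines.extend(schedules)
    return "\n".join(lines)

# Reproduces the example value shown above
value = build_import_schedule(
    ["H 2,6,19,22 * * 2,4,6", "H * * * *", "1 3,8,21 */2 * *"],
    timezone="Europe/Berlin",
)
```

The resulting string is what you would place in the import_schedule field of a create or edit request.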
Create
To create a new site, you can use a POST request (or the insert method).
curl -d '{"title":"example site","reference":"myReferenceKey:myReference1234", "id_column": "uniqueIdentifier", "import_schedule": "8 * * * *"}' \
https://platform-api.productsup.io/platform/v2/projects/321/sites
# result:
{
"success": true,
"Sites": [{
"id": 125,
"title": "example site",
"created_at": "2015-07-30 12:54:52",
"project_id": 321,
"import_schedule": "8 * * * *",
"links": [...]
}]
}
<?php
$SitesService = new \Productsup\Service\Sites($Client);
$project = new \Productsup\Platform\Project();
$project->id = 321;
$SitesService->setProject($project);
$siteObject = new \Productsup\Platform\Site();
$siteObject->title = 'new example site';
$siteObject->id_column = 'uniqueIdentifier';
/* optional
$reference = new \Productsup\Platform\Site\Reference();
$reference->setKey('myReferenceKey');
$reference->setValue('myReference1234');
$siteObject->addReference($reference);
*/
$result = $SitesService->insert($siteObject);
print_r($result);
/**
result:
Productsup\Platform\Site Object
(
[id] => 125
[title] => new example site
[created_at] => 2015-07-30 12:54:52
[project_id] => 321
)
*/
Create site
POST https://platform-api.productsup.io/platform/v2/sites
POST https://platform-api.productsup.io/platform/v2/projects/<projectId>/sites
URL parameters
Field | Type | Description |
---|---|---|
projectId | integer | Project under which to add the site. Required unless set in request body. |
HTTP headers
Name | Value |
---|---|
Content-Type | application/json |
The data to be inserted must be provided as a JSON object.
Request body fields
Field | Type | Description |
---|---|---|
title | string | Name of the site |
reference | string | Textual site reference, consisting of tagName and tagValue. This must be unique per site. |
project_id | integer | Project under which to add the site. Required unless provided in URL. |
id_column | string | ID column which is being used as an identifier when importing data to the platform. |
status | string | List of valid statuses |
import_schedule | string | A cron entry that sets the scheduling for data import. |
The fields id and created_at have to be empty; otherwise the values get overwritten, or the request may result in an error.
The field id_column is optional and only needed if the site identifier is not "id". By default, not passing an id_column sets the identifier to "id". If an empty value "" is given for id_column, the identifier column will not be set.
References or site tags
A reference allows you to use a textual representation of your own choice for a site. This way, you don't need to store the site ID itself.
A reference, also called a site tag, consists of a tagName and a tagValue. The reference format looks as follows: tagName:tagValue.
See Tag endpoints for an example of managing tags.
Response fields
Field | Type | Description |
---|---|---|
success | boolean | Indicates request status |
Sites | array | List of sites |
Site fields
Field | Type | Description |
---|---|---|
id | integer | Site identifier |
title | string | Name of the site |
status | string | List of valid statuses |
project_id | integer | Identifier of the project this site belongs to |
import_schedule | string | A cron entry that sets the scheduling for data import. |
id_column | string | Name of the column that is considered as an identifier |
processing_status | string | Status of the site's latest job (Running/Done) |
created_at | date | Date of creation |
links | array | List of relevant resources |
Links fields and values
See link fields
Edit
To edit an existing site, you can use a PUT request as follows:
curl -d '{"id": 1, "project_id": 1, "title":"My test site", "import_schedule": "TZ=Europe/Berlin\nH 2,6,19,22 * * 2,4,6"}' \
https://platform-api.productsup.io/platform/v2/projects/1/sites/1
# result:
{
"success": true,
"Sites": [{
"id": 1,
"title": "My test site",
"created_at": "2015-07-30 12:54:52",
"project_id": 1,
"import_schedule": "TZ=Europe\/Berlin\nH 2,6,19,22 * * 2,4,6\nH * * * *",
"links": [...]
}]
}
Update site
PUT https://platform-api.productsup.io/platform/v2/sites/<siteId>
PUT https://platform-api.productsup.io/platform/v2/projects/<projectId>/sites/<siteId>
URL parameters
Field | Type | Description |
---|---|---|
projectId | integer | Project under which to edit the site. |
siteId | integer | Existing site that is being edited. |
HTTP headers
Name | Value |
---|---|
Content-Type | application/json |
The data to be inserted must be provided as a JSON object.
Request body fields
Field | Type | Description |
---|---|---|
id | integer | Existing site that will be edited. |
project_id | integer | Project under which to edit the site. |
title | string | Name of the site |
id_column | string | ID column which is being used as an identifier when importing data to the platform. |
status | string | List of valid statuses |
import_schedule | string | A cron entry that sets the scheduling for data import. |
The field id_column is optional and only needed if the site identifier needs to be set. If an empty value "" is given for id_column, the identifier column will not be set.
Response fields
Field | Type | Description |
---|---|---|
success | boolean | Indicates request status |
Sites | array | List of sites |
Site fields
Field | Type | Description |
---|---|---|
id | integer | Site identifier |
title | string | Name of the site |
status | string | List of valid statuses |
project_id | integer | Identifier of the project this site belongs to |
import_schedule | string | A cron entry that sets the scheduling for data import. |
id_column | string | Name of the column that is considered as an identifier |
processing_status | string | Status of the site's latest job (Running/Done) |
created_at | date | Date of creation |
links | array | List of relevant resources |
Links fields and values
See link fields
Delete
Delete site
DELETE https://platform-api.productsup.io/platform/v2/sites/<siteId>
URL parameters
Field | Type | Description |
---|---|---|
siteId | integer | Site to delete |
Response body fields
Field | Type | Description |
---|---|---|
success | boolean | Indicates the success of the action |
curl -X DELETE https://platform-api.productsup.io/platform/v2/sites/125
# response:
{"success":true}
<?php
$SitesService = new Productsup\Service\Sites($Client);
$result = $SitesService->delete(125); // id fetched from API
// result is true if the delete was successful
Copy
To copy a site, you can use a POST request, or the insert method.
curl --location --request POST 'https://platform-api.productsup.io/platform/v2/sites/1/copy' \
--header 'Content-Type: application/json' \
--header 'X-Auth-Token: value' \
--data-raw '{
"site_name": "example copy site",
"project_id": "2",
"copy_export": true,
"copy_schedule_and_trigger": true,
"copy_shared_setting": true
}'
# result:
{
"success": true,
"Sites": [{
"id": "2",
"title": "example copy site",
"created_at": "2023-07-27 12:54:52",
"project_id": "2",
"import_schedule": "8 * * * *",
"links": [...]
}]
}
Copy site
POST https://platform-api.productsup.io/platform/v2/sites/<siteId>/copy
URL parameters
Field | Type | Description |
---|---|---|
siteId | integer | The site whose configuration you want to copy. |
HTTP headers
Name | Value |
---|---|
Content-Type | application/json |
Provide the inserted data as a JSON object.
Request body fields
Field | Type | Description |
---|---|---|
site_name | string | Name of the site |
project_id | string | To copy the site under a chosen project. Default value is the project id of the copied site. |
copy_export | boolean | To copy all exports configuration to the new site. Default value is false. |
copy_schedule_and_trigger | boolean | To copy all schedules and triggers to the new site. Default value is false. |
copy_shared_setting | boolean | To copy the data flow configuration to the new site. This only works for sites that are subscribed to at least one element of another site, i.e. the site is a "consumer" of another site acting as the "provider" of elements; only such a consumer site can be copied this way. Default value is false. |
Response fields
See response fields
Site fields
See site fields
Links fields and values
See link fields
Site tags
Site tag endpoints offer our clients more independence and flexibility by letting them manage and customize their Productsup sites with tags.
List tags
Example: List all tags for a site
curl --location 'http://platform-api.productsup.io/platform/v2/sites/<SITE_ID>/tags' \
--header 'X-Auth-Token: TOKEN'
Example: List a single site tag by tag ID
curl --location 'http://platform-api.productsup.io/platform/v2/sites/<SITE_ID>/tags/<TAG_ID>' \
--header 'X-Auth-Token: TOKEN'
You can either list all tags for a site or a specific site tag. To list all tags for a site, you need to provide the site_id. To list an individual site tag, you need to provide the tag_id.
Create a tag
Example: Create a tag
curl --location 'http://platform-api.productsup.io/platform/v2/sites/<SITE_ID>/tags' \
--header 'X-Auth-Token: TOKEN' \
--header 'Content-Type: application/json' \
--data '{
"site_id": "<SITE_ID>",
"key": "<TAG_KEY>",
"value": "<TAG_VALUE>",
"readonly": 0
}'
When you create a tag, you need to specify the following attributes:
- site_id contains the site ID the tag relates to.
- key contains the name of the tag.
- value contains the value of the tag.
- readonly shows whether the tag is editable in the platform UI. If a tag is read-only, end users can't edit it in the platform.
Delete a tag
Example: Delete a tag
curl --location --request DELETE 'http://platform-api.productsup.io/platform/v2/sites/<SITE_ID>/tags/<TAG_ID>' \
--header 'X-Auth-Token: TOKEN'
If you no longer need a tag, you can delete it by providing the site_id and the tag_id.
Site Errors
With site errors, you can see the last errors or add new custom errors for a site.
Get
To list errors for one site, you can use get. See Sites for how to identify sites.
Requesting the errors of one site
<?php
$site = new \Productsup\Platform\Site();
$site->id = 123;
$errorService = new \Productsup\Service\Errors($client);
$errorService->setSite($site);
// optional params
$errorService->setParam('pid','abc456def');
$errorService->setParam('limit',1);
$errorService->setParam('offset',2);
$result = $errorService->get();
/*
result will look like this:
Array
(
[0] => Productsup\Platform\Error Object
(
[id] => 123
[pid] => abd456
[error] => 10081
[data] =>
[datetime] => 2003-11-14 00:00:00
[links:protected] => ...
)
[1] => Productsup\Platform\Error Object
(
[id] => 124
[pid] => 537df1d87c39c
[error] => 10012
[data] => {"FTP Host":"sftp:\/\/example.org","User":"sftpuser"}
[datetime] => 2003-11-15 00:00:00
[links:protected] => ...
)
...
*/
curl https://platform-api.productsup.io/platform/v2/sites/123/errors
# response:
{
"success": true,
"Errors": [
{
"id": "1802017",
"pid": "537cb0659a7dc",
"error": "10012",
"data": "{\"FTP Host\":\"sftp:\/\/example.org\",\"User\":\"sftpuser\"}",
"site_id": "123",
"datetime": "2003-11-15 00:00:00",
"type": "Error",
"classification": "medium",
"links": [{...}]
},
....
]
}
HTTP Request
GET https://platform-api.productsup.io/platform/v2/sites/<siteId>/errors
GET https://platform-api.productsup.io/platform/v2/sites/<siteId>/errors?pid=<pid>&limit=<limit>&offset=<offset>
URL parameters
Name | Type | Description |
---|---|---|
siteId | integer | Site to get the errors for |
Optional query parameters
Name | Example | Default | Description |
---|---|---|---|
pid | abc456def | (latest) | Process ID; by default, the latest process is shown |
limit | 10 | 50 | Maximum number of results |
offset | 20 | 0 | Results begin at this position |
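The limit and offset parameters combine to page through a long list of errors. A sketch of the URLs such paging produces (the helper name is illustrative; no network call is made here, a real client would GET each URL with its X-Auth-Token header):

```python
BASE = "https://platform-api.productsup.io/platform/v2"

def error_page_urls(site_id, limit=50, pages=3, pid=None):
    """Build GET URLs for successive pages of a site's errors.

    Each page advances the offset by `limit`; an optional pid restricts
    the result to one specific process.
    """
    urls = []
    for page in range(pages):
        url = f"{BASE}/sites/{site_id}/errors?limit={limit}&offset={page * limit}"
        if pid:
            url += f"&pid={pid}"
        urls.append(url)
    return urls

urls = error_page_urls(123, limit=50, pages=2)
```

Paging stops naturally once a page returns fewer than `limit` entries.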
Response fields
Field | Type | Description |
---|---|---|
success | boolean | Indicates request status |
Errors | array | List of errors |
Error fields
Field | Type | Description |
---|---|---|
id | integer | Internal identifier |
pid | string | Process identifier |
error | integer | Error identifier |
data | array | Additional information about the error |
site_id | integer | Site identifier |
message | string | End user friendly error message |
datetime | string | Date time of when the error has occurred |
type | string | Type of the error (Error, Warning, Info) |
classification | string | Severity of the error (low, medium, high) |
links | array | List of relevant resources |
Links fields and values
Name | Description |
---|---|
self | Link to the error endpoint |
Import History
With the import history endpoint, you can query meta information about the most recent imports.
Get
Lists the information about your last imports.
<?php
$reference = new \Productsup\Platform\Site\Reference();
$reference->setKey(\Productsup\Platform\Site\Reference::REFERENCE_SITE);
$reference->setValue(123); // site id
$importHistory = new \Productsup\Service\ImportHistory($client);
$importHistory->setReference($reference);
$history = $importHistory->get();
print_r($history);
/*
result will look like this:
Array
(
[0] => Productsup\Platform\ImportHistory Object
(
[id] => 1111111111
[site_id] => 123
[import_time] => 2015-07-15 15:02:11
[product_count] => 18370
[pid] => null
[links:protected] => Array
(
[site] => http://api.productsup.io/platform/v2/sites/123
)
[reference:protected] =>
)
)
*/
Requesting the import history of one site by its ID
curl https://platform-api.productsup.io/platform/v2/sites/123/importhistory
# response:
{
"success": true,
"Importhistory": [
{
"id": "11111111",
"site_id": 1234,
"import_time": "2015-01-01 11:22:33",
"product_count": "18370",
"pid": "47e6b828-3210-3568-8ec3-85ed3e2d944c",
"links": [...]
},
...
]
}
HTTP Request
GET https://platform-api.productsup.io/platform/v2/sites/<siteId>/importhistory
URL parameters
Field | Type | Description |
---|---|---|
siteId | integer | Site to list import history for |
Response fields
Field | Type | Description |
---|---|---|
success | boolean | Indicates request status |
Importhistory | array | List of imports |
Import history fields
Field | Type | Description |
---|---|---|
id | integer | Internal ID |
site_id | integer | ID of the referenced site |
import_time | date | Date of the import |
product_count | integer | Total amount of imported products |
pid | string | Process ID (format: UUID 32) |
links | array | List of relevant resources |
Links fields and values
Name | Description |
---|---|
site | Link to site |
Channels
Channels are the targets of your data (e.g., Google Shopping or Amazon).
Get
To list all channels of your account, or only certain sites, you can use get.
<?php
$reference = new \Productsup\Platform\Site\Reference();
$reference->setKey(\Productsup\Platform\Site\Reference::REFERENCE_SITE);
$reference->setValue(123); // site id
$channelService = new \Productsup\Service\Channels($client);
$channelService->setReference($reference);
$channels = $channelService->get();
/*
result will look like this:
Array
(
[0] => Productsup\Platform\Channel Object
(
[id] => 321
[site_id] => 123
[name] => Criteo DE
[export_name] => Criteo
[history] =>
[links:protected] => ...
)
[1] => Productsup\Platform\Channel Object
(
[id] => 543
[site_id] => 123
[name] => Zanox DE
[export_name] => Zanox
[history] =>
[links:protected] => ...
)
...
*/
Requesting all channels of one site
curl https://platform-api.productsup.io/platform/v2/sites/123/channels
Requesting a specific channel
curl https://platform-api.productsup.io/platform/v2/sites/123/channels/321
Response
{
"success": true,
"Channels": [
{
"id": "321",
"site_id": "123",
"channel_id": "111",
"name": "Criteo DE",
"export_name": "Criteo",
"links": [...]
},
{
"id": "541",
"site_id": "123",
"channel_id": "222",
"name": "Zanox DE",
"export_name": "Zanox",
"links": [...]
}
]
}
Get all channels for a site
GET https://platform-api.productsup.io/platform/v2/sites/<siteId>/channels
URL parameters
Field | Type | Description |
---|---|---|
siteId | integer | Site to list channels for |
Get a channel by its identifier
GET https://platform-api.productsup.io/platform/v2/sites/<siteId>/channels/<channelId>
URL parameters
Field | Type | Description |
---|---|---|
siteId | integer | Site under which channel exists |
channelId | integer | Channel to get; use site channel relation id |
Response fields
Field | Type | Description |
---|---|---|
success | boolean | Indicates request status |
Channels | array | List of channels |
Channel fields
Field | Type | Description |
---|---|---|
id | integer | ID of the site channel relation |
site_id | integer | ID of the referenced site |
channel_id | integer | ID of the channel |
name | string | Name of the export you provided while creating the channel |
export_name | string | Generic name of the export in the productsup system |
links | array | List of relevant resources |
Links fields and values
Name | Description |
---|---|
self | Link to channel detail |
site | Link to site |
Channel History
With the channel history, you can get information on the last exports of a channel.
Get
To list the history, you can use get.
<?php
$reference = new \Productsup\Platform\Site\Reference();
$reference->setKey(\Productsup\Platform\Site\Reference::REFERENCE_SITE);
$reference->setValue(123); // site id
$channelService = new \Productsup\Service\Channels($client);
$channelService->setReference($reference);
$channels = $channelService->get(321,'history');
/*
result will look like this:
Array
(
[0] => Productsup\Platform\Channel Object
(
[id] => 2116
[site_id] => 368693
[name] => Criteo DE
[export_name] => FusePump Criteo
[history] => Array
(
[0] => Array
(
[id] => 25190
[site_id] => 368693
[site_channel_id] => 2116
[export_time] => 2015-08-27 16:22:57
[export_start] => 2015-08-27 16:22:55
[product_count] => 20562
[product_count_now] => 20562
[product_count_previous] => 0
[process_status] => 0
[pid] => 55df182bde8e8
[product_count_new] => 0
[product_count_modified] => 0
[product_count_deleted] => 0
[product_count_unchanged] => 0
[uploaded] => 0
)
[1] => Array
(
[id] => 25163
[site_id] => 368693
[site_channel_id] => 2116
[export_time] => 2015-08-27 15:48:03
[export_start] => 2015-08-27 15:48:02
[product_count] => 20562
[product_count_now] => 20562
[product_count_previous] => 0
[process_status] => 0
[pid] => 55df10f8c89d2
[product_count_new] => 0
[product_count_modified] => 0
[product_count_deleted] => 0
[product_count_unchanged] => 0
[uploaded] => 0
)
...
*/
Requesting the history of one channel
curl https://platform-api.productsup.io/platform/v2/sites/123/channels/321/history
# response:
{
"success": true,
"Channels": [
{
"id": "321",
"site_id": "123",
"channel_id": "1",
"name": "Google Merchant Center DE",
"export_name": "Google Merchant Center",
"links": [...],
"history": [
{
"id": "333",
"site_id": "123",
"site_channel_id": "444",
"export_time": "2015-09-30 10:18:56",
"export_start": "2015-09-30 10:18:54",
"product_count": "18697",
"product_count_now": "20904",
"product_count_previous": "0",
"process_status": "0",
"pid": "560b96899e334",
"product_count_new": "0",
"product_count_modified": "0",
"product_count_deleted": "0",
"product_count_unchanged": "0",
"uploaded": "0"
},
...
]
}
]
}
HTTP Request
GET https://platform-api.productsup.io/platform/v2/sites/<siteId>/channels/<channelId>/history
URL parameters
Field | Type | Description |
---|---|---|
siteId | integer | Site under which channel exists |
channelId | integer | Channel to get; use site channel relation id |
Response fields
Field | Type | Description |
---|---|---|
success | boolean | Indicates request status |
Channels | array | List of channels |
Channel fields
Field | Type | Description |
---|---|---|
id | integer | ID of the site channel relation |
site_id | integer | ID of the referenced site |
channel_id | integer | ID of the channel |
name | string | Name of the export you provided while creating the channel |
export_name | string | Generic name of the export in the productsup system |
links | array | List of relevant resources |
history | array | List of channel history |
Links fields and values
Name | Description |
---|---|
self | Link to channel detail |
site | Link to site |
History fields
Field | Type | Description |
---|---|---|
id | integer | Internal identifier |
site_id | integer | Identifier of the referenced site |
site_channel_id | string | Internal id for the combination of an Export and Site |
export_time | dateTime | Time when the process was finished |
export_start | dateTime | Time when the process was started |
product_count | integer | Number of products exported |
pid | string | Internal identifier for the process |
product_count_new | integer | Number of new products (only for delta exports) |
product_count_modified | integer | Number of updated products (only for delta exports) |
product_count_deleted | integer | Number of deleted products (only for delta exports) |
product_count_unchanged | integer | Number of unchanged products (only for delta exports) |
uploaded | integer | Indicator whether the export was uploaded to its destination |
Product Data (Write)
Before reading this document, take note of the following:
Product format
When uploading products, the following rules apply:
- Columns can be named freely, but ideally they should be lowercase and contain no spaces or special characters. Our technology relies on SQLite databases, so SQLite's limits apply.
- When columns are added:
- Existing data will have an empty value for these columns.
- When columns are not uploaded (removed):
- If any existing data has a value for that column, it will remain and the new data will just have an empty value.
- If no existing data has a value, that column will be automatically removed.
- The number of products per upload request is limited to 10,000. If you reach this limit, we recommend sending multiple upload requests with the same batch ID.
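The 10,000-product cap means a large catalogue has to be split across several upload requests that share one batch ID. A minimal chunking sketch (the helper name is illustrative):

```python
MAX_PRODUCTS_PER_REQUEST = 10_000

def chunk_products(products, size=MAX_PRODUCTS_PER_REQUEST):
    """Split a product list into chunks that respect the per-request limit.

    Each chunk would be POSTed to the same /products/<batchId>/upload URL,
    one request after another (concurrent uploads to the same batch are
    rejected with HTTP 423).
    """
    return [products[i:i + size] for i in range(0, len(products), size)]

# 25,001 dummy products split into three requests: 10,000 + 10,000 + 5,001
chunks = chunk_products([{"id": i} for i in range(25_001)])
```

Only after the last chunk is uploaded should the batch be committed.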
A list of example product data:
id | title | price | shipping | pup:isdeleted |
---|---|---|---|---|
123 | my first product | 1.23 | ||
124 | my second product | 3.21 | 0.99 | |
125 | my other product | 5.99 | - | 1 |
126 | another product of mine | 0.50 | - | -1 |
Unique identifier id
The id column is required and should contain a unique value to represent your product. If a duplicate value for the id is present, we import the last product with the duplicate value.
When sending updates, or in case a product needs to be deleted, the correct id should be used.
Response status codes
Code | Message | Details |
---|---|---|
200 | N/A | The request was successful |
423 | Request was blocked because another data operation is already running | Concurrent uploads to the same batch id are not allowed. This needs to be done consecutively. |
Uploading
curl --header 'Content-Type: application/json' -d '[{
"id": 123,
"title": "test title",
"price": 1.23
}, {
"id": 124,
"title": "next title",
"price": 3.21,
"shipping": "0.99"
}]' \
https://platform-api.productsup.io/platform/v2/sites/Identifier:123/products/md5string32chars/upload
<?php
// our php client builds the urls for you, but you have to set the info in the classes:
$ProductService = new Productsup\Service\ProductData($Client);
$Reference = new Productsup\Platform\Site\Reference();
/**
* You have to specify the site the products belong to.
* This is done by references to the site.
**/
// In case you have a productsup site id, you can pass it like this:
$Reference->setKey($Reference::REFERENCE_SITE);
$Reference->setValue(123); // Site ID
// In case you have a custom reference, you can use the follow logic
$Reference->setKey($siteTagName);
$Reference->setValue($siteTagValue);
// Assign the reference to the endpoint class
$ProductService->setReference($Reference);
// Actual creating of upload data
$ProductService->insert(array(
'id' => 123,
'price' => 1.23,
'description' => 'test title',
)
);
$ProductService->insert(array(
'id' => 124,
'price' => 3.21,
'description' => 'next title',
'shipping' => 0.99
)
);
Uploading to the API works via batches. A batch is a collection of products, potentially delivered by multiple requests. The batch can, once all product data is delivered, be committed or discarded.
HTTP Request
POST https://platform-api.productsup.io/platform/v2/sites/<siteIdentifier>/products/<batchId>/upload
URL parameters
Field | Type | Description |
---|---|---|
siteIdentifier | mixed | Either a siteId or siteTags |
batchId | string (32 characters) | Any sequence of characters that indicates a unique batch. It must be exactly 32 characters long. One approach is to generate a unique number and hash it with the MD5 algorithm. |
Site identifier values
Type | Data type | Description |
---|---|---|
siteId | integer | Using a site identifier (numeric value) |
siteTags | string (format: tagName:tagValue ) |
A combination of a tag name and tag value for a site, see also site tags |
HTTP headers
Name | Value |
---|---|
Content-Type | application/json |
The data to be inserted must be provided as a JSON object.
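A batch ID is any unique 32-character string; hashing a random UUID with MD5, as the table above suggests, is one way to produce one. A sketch that derives a batch ID and the matching upload URL using only the standard library (no request is actually sent; the helper names are illustrative):

```python
import hashlib
import uuid

def new_batch_id():
    """Return a 32-character hex string by MD5-hashing a random UUID."""
    return hashlib.md5(uuid.uuid4().bytes).hexdigest()

def upload_url(site_identifier, batch_id):
    """Build the upload endpoint URL for a numeric site ID or a tagName:tagValue pair."""
    return (f"https://platform-api.productsup.io/platform/v2"
            f"/sites/{site_identifier}/products/{batch_id}/upload")

batch_id = new_batch_id()
url = upload_url(123, batch_id)                    # by numeric site ID
tag_url = upload_url("Identifier:123", batch_id)   # by site tag
```

Reuse the same batch_id for every upload request of one batch, then call the commit (or discard) endpoint with it.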
Committing
curl --header 'Content-Type: application/json' -d '{"type":"full", "automatic_import":true}' \
https://platform-api.productsup.io/platform/v2/sites/Identifier:123/products/md5string32chars/commit
<?php
$ProductService->setImportType(\Productsup\Service\ProductData::TYPE_DELTA);
// OR
$ProductService->setImportType(\Productsup\Service\ProductData::TYPE_FULL);
// Disable automatic import scheduling, by disabling this a process needs to be triggered via the process endpoint
$ProductService->setAutomaticImportScheduling(false);
// note: if you do not define the type the "full" is used as default
$result = $ProductService->commit();
Once you are finished with uploading all product data, you can start the processing of the data. This is done batch by batch, so it's advisable to use one batch ID per upload (even if it consists of multiple upload requests).
HTTP Request
POST https://platform-api.productsup.io/platform/v2/sites/<siteIdentifier>/products/<batchId>/commit
URL parameters
See url parameters
HTTP Headers
See HTTP headers
Request body fields
Field | Type | Description |
---|---|---|
type | string | Type of upload |
automatic_import | boolean | Whether the automatic triggering of an import & export should be scheduled. |
Automatic import & export scheduling
The default behaviour is that 20 minutes after a commit, we schedule a full process (import and export). Every new commit within this time frame resets the timer to 20 minutes after that commit; the process is then triggered after the last commit. If you want to rely on this behaviour, set automatic_import to true.
Via the process endpoint, it's possible to programmatically trigger a process and be more specific about its type. When implementing this, we recommend setting automatic_import to false.
Type values
Value | Description |
---|---|
full | The current upload contains all products for the given site; all data from past uploads will be removed. |
delta | The current upload is only a part of all your products. Use this in case you plan to send incremental uploads. |
The commit type and the site's Product Update Mode should be set to matching values:
Commit value | Product Update Mode value |
---|---|
full | replace |
delta | update |
Response status codes
Code | Message | Details |
---|---|---|
200 | N/A | The request was successful |
423 | Request was blocked because another data operation is already running | Concurrent commits are not allowed, since a commit is a one-time operation for a batch ID. Once a commit starts, we lock the batch until a response is given. |
Discarding
curl https://platform-api.productsup.io/platform/v2/sites/Identifier:123/products/md5string32chars/discard
<?php
$result = $ProductService->discard();
If something has gone wrong during the upload process, it is possible to cancel the whole batch by calling the discard endpoint on a batch ID. This allows you to be stricter about your data integrity.
HTTP Request
POST https://platform-api.productsup.io/platform/v2/sites/<siteIdentifier>/products/<batchId>/discard
URL parameters
See url parameters
HTTP Headers
See HTTP headers
Deleting
There are two ways to delete products in the Platform API.
Deleting all products
curl --header 'Content-Type: application/json' \
--request DELETE 'https://platform-api.productsup.io/platform/v2/sites/Identifier:123/products'
It's possible to delete all product data stored in the Platform API by sending a DELETE request. Once you have sent the request, the next run will import 0 products and clear all stored data. Afterwards, you can start sending new data requests again.
Deleting one or more products
## Soft delete
curl -d '[{
"id": 124,
"pup:isdeleted": 1
}]' \
https://platform-api.productsup.io/platform/v2/sites/Identifier:123/products/md5string32chars/upload
## Hard delete
curl -d '[{
"id": 124,
"pup:isdeleted": -1
}]' \
https://platform-api.productsup.io/platform/v2/sites/Identifier:123/products/md5string32chars/upload
<?php
// Soft delete
$ProductService->delete(array(
'id' => 123,
)
);
// Hard delete
$ProductService->insert(array(
'id' => 123,
'pup:isdeleted' => -1
)
);
Deleting specific products can be achieved by adding the column pup:isdeleted. Depending on its value, a soft or a hard delete is triggered.
- A soft delete marks the product as deleted, but it will still show up in the Dataview of the Platform. Soft-deleted products will not be exported.
- When doing a hard delete, the product will not be imported and will not be visible in the Platform.
This applies to both full and delta uploads. A full upload also overrides all data, so there is no need to remove outdated products beforehand.
Value for pup:isdeleted | Description |
---|---|
1 | Soft delete, product will be present in Platform, but marked as deleted and will not be exported |
-1 | Hard delete, product will not be present in Platform |
0 | No delete, product will be present in Platform |
Product Data (Read)
Get
curl https://platform-api.productsup.io/product/v2/site/123/stage/intermediate/
?filter=id+%3C%3E+%27%27
&limit=5000
&offset=0
&fields%5B0%5D=id
&fields%5B1%5D=gtin
&hidden=0
{
"success": true,
"products": [{
"id": "123",
"gtin": "42"
}]
}
<?php
$productData = new \Productsup\Service\ProductData($client);
$reference = new \Productsup\Platform\Site\Reference();
$reference->setKey(\Productsup\Platform\Site\Reference::REFERENCE_SITE);
$reference->setValue(123); // site id
$productData->setReference($reference);
$query = new \Productsup\Query();
$query->fields = array(
'id',
'gtin'
);
$query->filter = "id <> ''";
$query->limit = 5000;
$query->offset = 0;
$products = $productData->get(
\Productsup\Service\ProductData::STAGE_INTERMEDIATE,
0,
$query
);
result:
Array
(
[success] => 1
[products] => Array
(
[0] => Array
(
[id] => 123
[gtin] => 42
)
)
)
HTTP Request - Get product data
GET https://platform-api.productsup.io/product/v2/site/<siteId>/stage/<stageName>/<stageId>?limit=<limit>&offset=<offset>&fields=<fields>&hidden=<hidden>&filter=<filter>
URL parameters
Field | Type | Description |
---|---|---|
siteId | integer | Site identifier |
stageName | string | Stage name |
stageId | integer | Stage id, set to 0 for stages import and intermediate |
Stage names
The table below lists the processing stages in which the product data is available. Each stage can add transformations and filters on product data. Export and channel are quite similar in their use; however, we no longer create new exports, only new channels.
Name | Stage description |
---|---|
import | Directly after importing the product data from an API upload |
intermediate | Generic transformations, always required for all product data, may be applied |
export | Specific export transformations are applied |
channel | Specific channel transformations are applied |
Query fields
The query fields allow a more precise control over the product data being returned. Certain requirements and filters can be set, as well as functionality to paginate through long result sets.
Name | Type | Default | Description |
---|---|---|---|
limit | integer | 5000 | Maximum number of products |
offset | integer | 0 | Offset for querying products |
fields | array | all fields | Array of fields to select |
hidden | numeric boolean (0 or 1) | 0 | If set to 1, hidden fields (fields that are not exported) are also included |
filter | string | none | Condition to filter for, in SQL syntax |
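Putting the query fields together, here is a minimal sketch in Python of building a correctly encoded request URL. The helper name is illustrative and not part of any official client; the site id, stage, fields, and filter match the example request above.

```python
from urllib.parse import urlencode

def build_product_read_url(site_id, stage, fields, limit=5000, offset=0,
                           hidden=0, filter_expr=None):
    """Build a product read URL with properly encoded query fields."""
    base = f"https://platform-api.productsup.io/product/v2/site/{site_id}/stage/{stage}/"
    params = [("limit", limit), ("offset", offset), ("hidden", hidden)]
    # Array-style field selection: fields[0]=id&fields[1]=gtin ...
    params += [(f"fields[{i}]", name) for i, name in enumerate(fields)]
    if filter_expr:
        # SQL-style condition, e.g. "id <> ''"; urlencode handles the escaping
        params.append(("filter", filter_expr))
    return base + "?" + urlencode(params)

url = build_product_read_url(123, "intermediate", ["id", "gtin"],
                             filter_expr="id <> ''")
```

Note how `urlencode` produces the same percent-encoding seen in the curl example, e.g. `fields%5B0%5D=id` and `filter=id+%3C%3E+%27%27`.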
Response body fields
Field | Type | Description |
---|---|---|
success | boolean | Indicates request status |
products | array | List of product data, containing at least an id column |
<?php
// see Product Data write for more info
$productData = new \Productsup\Service\ProductData($client);
$reference = new \Productsup\Platform\Site\Reference();
$reference->setKey(\Productsup\Platform\Site\Reference::REFERENCE_SITE);
$reference->setValue(123); // site id
$productData->setReference($reference);
$metaData = $productData->getProperties(\Productsup\Service\ProductData::STAGE_INTERMEDIATE,0);
/** response:
Array (
[success] => 1
[columns] => Array
(
[0] => id
[1] => gtin
[2] => price
...
)
[products] => 42
)
*/
curl https://platform-api.productsup.io/product/v2/site/123/stage/intermediate/0/properties/
result:
{
"success": true,
"columns": ["id", "gtin", "price", ...],
"products": 42
}
HTTP Request - Get product data properties
GET https://platform-api.productsup.io/product/v2/site/<siteId>/stage/<stageName>/<stageId>/properties
URL parameters
See the URL parameters above.
Response body fields
Field | Type | Description |
---|---|---|
success | boolean | Indicates request status |
columns | array | Columns of the data set |
products | integer | Total amount of products in data set |
Response status codes
Code | Message | Details |
---|---|---|
200 | N/A | The process was successfully called or scheduled |
429 | Too many attempts | The API is rate-limiting your request. See rate limiting for more information. |
The API supports this product read endpoint, but the technical characteristics of our infrastructure setup don't allow it to be a high-performing endpoint. We currently do not offer a high-performing product read endpoint. Depending on the complexity of the filter, the database size and the number of columns, requests can sometimes take a couple of minutes or more. In these cases, we ask you not to send concurrent requests. Send consecutive requests only.
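Since responses can be slow, page through large result sets with consecutive requests only. A small sketch of the offset arithmetic, assuming the total product count was read from the properties endpoint's "products" field (the helper name is illustrative, not part of any official client):

```python
import math

def page_offsets(total_products, limit=5000):
    """Offsets for paging through a result set with consecutive requests.

    Issue one request per offset, waiting for each response before
    sending the next; never send these requests concurrently.
    """
    pages = math.ceil(total_products / limit)
    return [page * limit for page in range(pages)]

# For 12000 products with the default limit, three consecutive
# requests are needed, at offsets 0, 5000 and 10000.
offsets = page_offsets(12000)
```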
Process data
The process endpoint allows you to control when your data flows in and out of the Platform. It supports triggering jobs that import and export your data in the following combinations:
- Import: just import the data from all Data Sources into a Site
- Export: just export the data for one or all configured Channel(s)
- Combined: run an import and consecutively export for all Channels
It's important to keep the following topics in mind:
- The Process endpoint works with Sites, not Streams
- Only a single Process can run per site; we allow you to queue up to one additional Process
- If the queue is already full, triggering another request to the Process endpoint results in us rejecting the request with a 429 status code
Post
Trigger a processing action on your site.
Triggering an action on your site
<?php
$processAction = new Productsup\Service\Process($Client);
// Only required if not setting the site_id property for the model
$reference = new \Productsup\Platform\Site\Reference();
$reference->setKey($reference::REFERENCE_SITE);
$reference->setValue(123456789);
$processModel = new Process();
$processModel->action = 'import';
// Only required for types 'export' or 'channel'
// $processModel->action_id = 1234567890;
$processModel->addReference($reference);
// Instead of a reference you can also set the site id manually
// $processModel->site_id = 123457890;
$result = $processAction->post($processModel);
var_dump($result);
/*
result will look like this:
bool(true)
*/
curl --location --request POST 'https://platform-api.productsup.io/platform/v2/process/<siteId>' \
--header 'Content-Type: application/json' \
--header 'X-Auth-Token: accountId:yourToken' \
--data-raw '{"action": "import"}'
{
"success": true,
"process_id": "<uuid-32-formatted-string>"
}
URL parameters
Field | Type | Description |
---|---|---|
siteId | integer | Site you want to trigger processing for |
HTTP headers
Name | Value |
---|---|
Content-Type | application/json |
The request body has to be provided as a JSON object.
Request body fields
Field | Type | Mandatory | Description |
---|---|---|---|
action | string | mandatory | Action to trigger, see action value explanation below |
id | integer | optional | Export or channel id, only required for action types export and channel |
batch_id | string | optional | A batch id that was recently committed, only applicable to Platform API batches (see below for more information) |
- Platform API data is not immediately available for import after committing. There is a delay due to architectural reasons; with the batch_id parameter you can ensure that the process is triggered once your batch is ready for import.
- Stream API data is immediately available for import once uploaded. The batch_id parameter is not relevant when importing from Streams and is unrelated to the Stream API Batch ID.
Action value explanation
Action value | Description |
---|---|
import | Trigger an import on the site |
export | Trigger an export, export id is required (old style of exporting) |
channel | Trigger a channel, channel id is required (new style of exporting) |
export-all | Trigger all exports and channels |
all | Trigger an import, all exports and channels |
Response body fields
Field | Type | Description |
---|---|---|
success | boolean | Indicates status of job scheduling on the Jenkins server |
process_id | string | Process Identifier aka PID (format: UUID 32 - only on success) |
message | string | On failure this field will indicate why the request failed |
Response status codes
Code | Message | Details |
---|---|---|
200 | N/A | The process was successfully called or scheduled |
429 | Too many attempts | The API is rate-limiting your request. See rate limiting for more information. |
429 | Cannot trigger a new process, a process is already in the current queue. | This indicates that our job queue for this site still has a job that is queued. Retrying in this case makes little sense, since the current job in the queue needs to be started before another job can be queued or ran. |
500 | Could not trigger job | A previous request for this site has locked any further requests. The lock is released once the previous request has returned a response. |
500 | Error occurred while interacting with the job server | This indicates a problem with our job-running infrastructure. A single retry can be done to ensure it wasn't a one-off occurrence. However, continuously retrying on this error just causes more problems. |
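The retry guidance in the table above can be sketched as a small decision helper. This is an illustrative sketch, not part of any official client; it assumes you only inspect the HTTP status code of the Process response.

```python
def should_retry(status_code):
    """Decide whether retrying a Process request makes sense.

    Follows the documented guidance: a 429 means the request was
    rate-limited or a job is already queued, so retrying immediately
    makes little sense; a 500 from the job server warrants a single
    retry, but not continuous retries.
    """
    if status_code == 200:
        return False  # the process was called or scheduled, nothing to retry
    if status_code == 429:
        return False  # wait until the queued job starts or the limit resets
    if status_code == 500:
        return True   # retry once to rule out a one-off occurrence
    return False
```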
Site Stream Data Sources
The data source management endpoint allows you to link your streams to sites programmatically. You can create Stream API datasources and link them to sites, update them, delete them, or get more information about existing links.
List stream datasources
curl --header 'X-Auth-Token: value' \
--header 'Content-Type: application/json' \
--request GET 'https://platform-api.productsup.io/platform/v2/sites/1/streams'
# result:
{
"success": true,
"Sources": [
{
"id": 1,
"site_id": 1,
"description": "stream api",
"source": "",
"import_type": 1,
"import_id": 331,
"status": "active",
"settings": [
"stream : 1"
]
}
],
"meta": {
"cursor": {
"current": 0,
"prev": 0,
"next": 1,
"count": 1
}
}
}
List stream datasources
List all stream datasources linked to a site.
HTTP Request
GET https://platform-api.productsup.io/platform/v2/sites/<siteId>/streams
URL parameters
Field | Type | Description |
---|---|---|
siteId | integer | Site identifier (numeric value) |
Query fields
HTTP headers
Name | Value |
---|---|
Content-Type | application/json |
Response fields
Field | Type | Description |
---|---|---|
success | boolean | Indicates request status |
Sources | array | List of datasources |
meta | array | Cursor details |
Datasource fields
Field | Type | Description |
---|---|---|
id | integer | DataSource identifier |
site_id | integer | Identifier of the site this datasource belongs to |
description | string | Description of the datasource |
status | string | Datasource status, see valid statuses below |
source | string | URL of datasource |
import_type | integer | Import type, see valid import types below |
import_id | integer | Identifier of the import method |
settings | array | List of all settings related to datasource |
Datasource status information
Value for status | Description |
---|---|
active | The datasource is fully operational |
paused | The datasource is paused and the data is not imported |
Datasource import type information
Value for import type | Description |
---|---|
1 | Main Data Feed |
2 | Additional Data Feed |
Response status codes
Code | Message | Details |
---|---|---|
200 | Successfully returned the list. | The list of all datasources related to the site is returned. |
403 | You don't have the rights to access this resource. | Lack of permissions to access the site. |
404 | The requested resource could not be found. | The site doesn't exist. |
401 | Unauthorized | Invalid authentication token used. |
Create stream datasource
curl --header 'X-Auth-Token: value' \
--header 'Content-Type: application/json' \
--request POST 'https://platform-api.productsup.io/platform/v2/sites/1/streams' \
--data-raw '{"import_type":1, "description":"stream api", "stream_id":1 , "status":"active"}'
# result:
{
"success": true,
"Sources": [
{
"id": 1,
"site_id": 1,
"description": "stream api",
"source": "",
"import_type": 1,
"import_id": 331,
"status": "active",
"settings": [
"stream : 1"
]
}
]
}
Create stream datasource
Create a datasource that reads data from a stream.
HTTP Request
POST https://platform-api.productsup.io/platform/v2/sites/<siteId>/streams
URL parameters
Field | Type | Description |
---|---|---|
siteId | integer | Site identifier (numeric value) |
HTTP headers
Name | Value |
---|---|
Content-Type | application/json |
Request body fields
Field | Type | Description |
---|---|---|
stream_id | integer | Stream identifier (numeric value) |
import_type | integer | Import type, see valid import types above |
description | string | Description of the datasource |
status | string | Datasource status, see valid statuses above |
Request default values (if not included in request body)
Field | Value |
---|---|
import_type | 1 |
status | active |
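A minimal sketch of assembling the create request body with the documented defaults applied. The helper name is illustrative and not part of any official client; only stream_id and description come from you, the other fields fall back to the defaults in the table above.

```python
def create_stream_datasource_body(stream_id, description,
                                  import_type=1, status="active"):
    """Request body for POST /platform/v2/sites/<siteId>/streams.

    Defaults mirror the documented fallbacks: import_type 1
    (Main Data Feed) and status "active".
    """
    return {
        "stream_id": stream_id,        # numeric stream identifier
        "description": description,    # free-text datasource description
        "import_type": import_type,    # 1 = Main Data Feed, 2 = Additional Data Feed
        "status": status,              # "active" or "paused"
    }

body = create_stream_datasource_body(1, "stream api")
```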
Response fields
See response fields
Datasource fields
Response status codes
Code | Message | Details |
---|---|---|
201 | The stream datasource for the site was created successfully. | The datasource was created with stream configuration and linked to the site. |
403 | You don't have the rights to access this resource. | Lack of permissions to access the site or the stream. |
404 | The requested resource could not be found. | The site or the stream doesn't exist. |
422 | The stream datasource for the site already exists. | The stream is already linked to the site. |
401 | Unauthorized | Invalid authentication token used. |
Update stream datasource
curl --header 'X-Auth-Token: value' \
--header 'Content-Type: application/json' \
--request PUT 'https://platform-api.productsup.io/platform/v2/sites/1/streams/1' \
--data-raw '{"import_type":1, "description":"stream api", "status":"active"}'
# result:
{
"success": true,
"Sources": [
{
"id": 1,
"site_id": 1,
"description": "stream api",
"source": "",
"import_type": 1,
"import_id": 331,
"status": "active",
"settings": [
"stream : 1"
]
}
]
}
Update stream datasource
Update the datasource that is linked to the stream.
HTTP Request
PUT https://platform-api.productsup.io/platform/v2/sites/<siteId>/streams/<streamId>
URL parameters
Field | Type | Description |
---|---|---|
siteId | integer | Site identifier (numeric value) |
streamId | integer | Stream identifier (numeric value) |
HTTP headers
Name | Value |
---|---|
Content-Type | application/json |
Request body fields
Field | Type | Description |
---|---|---|
import_type | integer | Import type, see valid import types above |
description | string | Description of the datasource |
status | string | Datasource status, see valid statuses above |
Response fields
See response fields
Datasource fields
Response status codes
Code | Message | Details |
---|---|---|
200 | The stream datasource for the site was updated successfully. | The datasource was updated. |
403 | You don't have the rights to access this resource. | Lack of permissions to access the site or the stream. |
404 | The requested resource could not be found. | The site or the stream doesn't exist. |
422 | The stream datasource for the site doesn't exist. | The stream is not linked to the site. |
401 | Unauthorized | Invalid authentication token used. |
Delete stream datasource
curl --header 'X-Auth-Token: value' \
--header 'Content-Type: application/json' \
--request DELETE 'https://platform-api.productsup.io/platform/v2/sites/1/streams/1'
# result:
{
"success": true,
"message": "Resource was deleted successfully!"
}
Delete stream datasource
Delete the datasource that is linked to the stream.
HTTP Request
DELETE https://platform-api.productsup.io/platform/v2/sites/<siteId>/streams/<streamId>
URL parameters
Field | Type | Description |
---|---|---|
siteId | integer | Site identifier (numeric value) |
streamId | integer | Stream identifier (numeric value) |
HTTP headers
Name | Value |
---|---|
Content-Type | application/json |
Response fields
Field | Type | Description |
---|---|---|
success | boolean | Indicates request status |
message | string | Indicates request message |
Response status codes
Code | Message | Details |
---|---|---|
200 | The stream datasource for the site was deleted successfully. | The datasource was deleted. |
403 | You don't have the rights to access this resource. | Lack of permissions to access the site or the stream. |
404 | The requested resource could not be found. | The site or the stream doesn't exist. |
422 | The stream datasource for the site doesn't exist. | The stream is not linked to the site. |
401 | Unauthorized | Invalid authentication token used. |
Status
The status endpoint can be used to get the status of a Process Identifier (PID) for a specific site. A valid PID must always be given as the secondary parameter. The Process endpoint returns a valid PID when called. If an invalid PID is given, the site's status will always be queued.
Get
Get the status of a PID for a site
curl -i -L -X GET \
'https://platform-api.productsup.io/platform/v2/sites/<siteId>/status/<pid>'
{
"success": true,
"status": "failed",
"links":[
{
## Only available when status equals failed
"errors": "https://platform-api.productsup.io/platform/v2/sites/<siteId>/errors?pid=<pid>"
},
{
"self": "https://platform-api.productsup.io/platform/v2/sites/<siteId>/status/<pid>"
}
]
}
HTTP Request
GET https://platform-api.productsup.io/platform/v2/sites/<siteId>/status/<pid>
URL parameters
Field | Type | Description |
---|---|---|
siteId | integer | Site to which the PID belongs |
pid | string | The PID you want to check the status of |
Response body fields
Field | Type | Description |
---|---|---|
success | boolean | Indicates status of the request |
status | string | Indicates the status of the pid |
links | array | List of links, to resource itself and error resource (if status equals failed) |
Status value explanation
Status value | Description |
---|---|
queued | This default status means that the provided PID is valid, but the site hasn't started running yet. Note: The platform validates only the PID format and doesn't check its existence as it is resource-consuming. If the provided PID is valid but non-existing, the status doesn't change. |
running | The platform is currently processing the site's content. |
success | The platform has completed the site processing without errors. |
failed | The platform has completed the site processing, but some errors occurred. |
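A site run can therefore be tracked by polling the status endpoint until a terminal status is reached. A minimal sketch, assuming a caller-supplied fetch function; in a real client that function would GET .../sites/<siteId>/status/<pid> and return the "status" field. The helper name and polling interval are illustrative.

```python
import time

def wait_for_process(fetch_status, poll_interval=0.0):
    """Poll a PID until it reaches a terminal status.

    fetch_status is any callable returning one of the documented
    status values: queued, running, success, or failed.
    """
    terminal = {"success", "failed"}
    while True:
        status = fetch_status()
        if status in terminal:
            return status
        time.sleep(poll_interval)  # use a real delay (e.g. seconds) in production

# Simulated run: the site is queued, then running, then finishes.
responses = iter(["queued", "running", "success"])
result = wait_for_process(lambda: next(responses))
```

On a "failed" result, follow the errors link from the status response to inspect what went wrong.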
API Errors
Example error response
{
"success": false,
"message": "Forbidden, your auth token seems to be invalid"
}
<?php
try {
// code that might result in an error from the API
} catch (\Productsup\Exceptions\ServerException $e) {
// An exception happened at the API server. This should not happen but may be caused by a short downtime.
// You may want to retry later. If you keep getting this kind of exception, please notify us.
throw new Exception('Error at the productsup API, retry later');
} catch (\Productsup\Exceptions\ClientException $e) {
// Most likely some of the data you provided was malformed
// The error codes follow http status codes, @see http://en.wikipedia.org/wiki/List_of_HTTP_status_codes#4xx_Client_Error
// The message may give more information on what was wrong:
echo $e->getCode().' '.$e->getMessage();
} catch (\Exception $e) {
// Exceptions not within the Productsup namespace are not thrown by the client, so these exceptions were most likely
// thrown from your application or another 3rd party application
// however, if you don't catch Productsup exceptions explicitly, you can catch them all like this
echo $e->getCode().' '.$e->getMessage();
throw $e;
}
The API follows HTTP status codes for error handling. The response body is again JSON; the "message" field provides more information about the specific error.
The API can return the following status codes:
Error Code | Meaning |
---|---|
400 | Bad Request -- Your request was malformed |
401 | Unauthorized -- Invalid authentication token used |
403 | Forbidden -- The entity requested is hidden for administrators only |
404 | Not Found -- The specified entity could not be found |
405 | Method Not Allowed -- You tried to access an entity with an invalid method |
406 | Not Acceptable -- You requested a format that isn't JSON |
410 | Gone -- The entity requested has been removed from our servers |
500 | Internal Server Error -- We had a problem with our server. Try again later. |
503 | Service Unavailable -- We're temporarily offline for maintenance. Please try again later. |
Rate limiting
Different endpoints of the Platform API are subject to different rate limits. We enforce rate limits at an account and route level.
The Platform API applies the following rate limits to all requests coming from one account:
Endpoint | Description | Rate limit |
---|---|---|
/platform/v2/sites/{siteIdentifier}/products/{batchId}/upload | Products endpoint for uploading batches | 15 requests per second |
/platform/v2/sites/{siteIdentifier}/products/{batchId}/commit | Products endpoint for committing uploaded batches | 15 requests per second |
/platform/v2/sites/{siteIdentifier}/products/{batchId}/discard | Products endpoint for discarding batches | 15 requests per second |
/platform/v2/process/{siteIdentifier} | Process endpoint for processing data | 1 request per site per 5 minutes |
/product/v2/site/{siteIdentifier}/stage/{stageName}/{stageId} | Products endpoint for reading data | 60 requests per minute |
/* | All other endpoints (the limit applies to each endpoint individually) | 5 requests per second |
HTTP headers
The response headers of all HTTP requests sent to the Platform API show your current rate limits:
$ curl -I https://platform-api.productsup.io
HTTP/2 200
X-RateLimit-Limit: 5
X-Ratelimit-Remaining: 1
X-Ratelimit-Retry-After: 1651174223
Name | Description |
---|---|
X-RateLimit-Limit | The maximum number of requests allowed per second. |
X-Ratelimit-Remaining | The number of requests remaining in the current rate-limiting window. |
X-Ratelimit-Retry-After | The Unix time when the current rate-limiting window resets. |
When you exceed the rate limit, you get an error response:
HTTP/2 429
X-RateLimit-Limit: 5
X-Ratelimit-Remaining: 1
X-Ratelimit-Retry-After: 1651174223
{"success":false,"message":"Too many attempts."}
Changelog
Introduction
This section is a record of all notable changes made to Stream API and Platform API.
Stream API changelog
Here are the recent changes introduced to the Stream API:
01/08/2024
- Added support for gzip compression to upload compressed files. See Compression for more information.
21/03/2024
- Clarified allowed payload formats for the Stream API when uploading data. Chunked Streams accept application/x-ndjson, and Referenced Streams accept application/json or application/x-ndjson.
08/01/2024
- The Batches API endpoint is now only available upon request. If you need access to it, reach out to your Client Success Manager.
29/11/2023
- Added a limit of 100 streams per account. If you need more streams, reach out to your Client Success Manager.
25/09/2023
- Released version 0.4 of the Open API definition file.
What's Changed
- Added enumerated possible response codes (e.g., 400, 422, 500) to all endpoints for improved error handling.
- Modified schema format and maximum values for the page[limit] and page[offset] query parameters on the get streams endpoint.
- Specified the maximum number of items found in the $.error.errors responses for all endpoints.
- Set a maximum length of 499900 bytes for the id attribute on the products upload endpoint. Any product with an id attribute exceeding this limit will be rejected.
- Updated the format and defined the maximum length for the createdAt and updatedAt properties in the streams endpoint response and the completedAt property in the batches endpoint response.
- Added the format, minimum, and maximum specifications for the successCount and errorCount properties in the batches endpoint response.
- Limited the maximum number of distinct errors that the batches endpoint response can return.
01/06/2023
- Added a status page link
18/05/2023
- Changed the status code from 400 to 422 when an "Empty Payload" error occurs for an uploaded batch on the /streams/{streamId}/products endpoint.
- Fixed a bug in the /streams/{streamId}/products endpoint response where the relationships object was incorrectly placed at the root path instead of as a child of the data object.
- Changed Location to Content-Location in the header of the response that the POST /streams operation returns for a newly created stream.
⚠️ Deprecations:
- The relationships object found at the $.relationships path of the /streams/{streamId}/products endpoint's response is deprecated. Use $.data.relationships instead.
- The Location header found in the response of the POST /streams operation is deprecated. Use Content-Location instead.
31/03/2023
- Increased the rate limit for the batches endpoint to 100 requests per second per {streamId}. The documented rate limits are now enforced as strict limits. Crossing them causes throttling.
11/01/2023
- Changed "id" as a string in the OpenAPI spec.
- Fixed a bug with supporting the charset part of the Content-Type header.
Platform API changelog
Here are the recent changes introduced to the Platform API:
02/09/2024
- Added tag endpoints. See tag endpoints for more information.
24/05/2024
- Exposed error "type" and "classification" in the site errors endpoint. See errors endpoint for more information.
05/09/2023
- Added a new API endpoint to copy sites. See copy site for more information.
23/05/2023
- Introduced new API rate limits for all endpoints. See rate limiting for more information.
01/06/2023
- Added status page link
15/02/2023
- Fixed the empty file creation handling when rows were written because of a one-hour delay between the last upload and the commit. See upload and commit rule for more information.
31/01/2023
- Fixed the issue with the products endpoint returning a response with an empty response body.
20/01/2023
- Exposed datetime in the site errors endpoint. See errors endpoint for more information.
10/01/2023
- Fixed a bug in deleteNotMatchingFeedHeadersIds.