OpenAPI spec update from glideapps/glide#29431 #17

Merged 5 commits on Sep 19, 2024
4 changes: 4 additions & 0 deletions api-reference/v2/resources/changelog.mdx
@@ -3,6 +3,10 @@ title: Glide API Changelog
sidebarTitle: Changelog
---

### September 18, 2024

- Endpoints that receive tabular data can now accept CSV and TSV request bodies.
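As a sketch of what such request bodies look like, the snippet below serializes the same rows as CSV and as TSV with Python's `csv` module (the header line carries the column IDs, as the spec's examples show); the helper name `encode_rows` is illustrative, not part of the API.

```python
import csv
import io

def encode_rows(columns, rows, delimiter=","):
    # Serialize a tabular request body: the first line is the column
    # IDs, and each subsequent line is one row of data.
    buf = io.StringIO()
    writer = csv.writer(buf, delimiter=delimiter, lineterminator="\n")
    writer.writerow(columns)
    writer.writerows(rows)
    return buf.getvalue()

# Send csv_body with Content-Type: text/csv, or tsv_body with
# Content-Type: text/tab-separated-values.
csv_body = encode_rows(["Name", "Age"], [["Alice", 25], ["Bob", 30]])
tsv_body = encode_rows(["Name", "Age"], [["Alice", 25], ["Bob", 30]], "\t")
```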

### September 13, 2024

- Introduced a new "Limits" document that outlines rate and operational limits for the API.
2 changes: 2 additions & 0 deletions api-reference/v2/stashing/put-stashes-serial.mdx
@@ -5,6 +5,8 @@ openapi: put /stashes/{stashID}/{serial}

When using large datasets with the Glide API, it may be necessary to break them into smaller chunks for performance and reliability. We call this process "stashing."

Tabular data may be stashed in JSON, CSV, or TSV format.
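A minimal sketch of stashing CSV chunks with Python's stdlib follows. The base URL `https://api.glideapp.io` and Bearer auth are assumptions here; confirm both against the API reference before use.

```python
import urllib.request

API_BASE = "https://api.glideapp.io"  # assumed base URL; verify in the API docs

def build_stash_request(token, stash_id, serial, body, content_type="text/csv"):
    # One PUT per chunk: /stashes/{stashID}/{serial}.
    return urllib.request.Request(
        f"{API_BASE}/stashes/{stash_id}/{serial}",
        data=body.encode("utf-8"),
        method="PUT",
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": content_type,
        },
    )

def stash_chunks(token, stash_id, chunks, content_type="text/csv"):
    # Send chunks under increasing serials so the stash preserves order.
    for serial, body in enumerate(chunks, start=1):
        with urllib.request.urlopen(
            build_stash_request(token, stash_id, serial, body, content_type)
        ) as resp:
            resp.read()
```

Use `text/tab-separated-values` as the `content_type` for TSV chunks.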

<Tip>
To learn more about stashing and how to use it to work with large datasets, please see our [introduction to stashing](/api-reference/v2/stashing/introduction).
</Tip>
2 changes: 2 additions & 0 deletions api-reference/v2/tables/post-table-rows.mdx
@@ -5,6 +5,8 @@ openapi: post /tables/{tableID}/rows

Add row data to an existing Big Table.

Row data may be passed in JSON, CSV, or TSV format.
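For illustration, here is a hedged sketch of posting rows as TSV with Python's stdlib. The base URL and Bearer auth are assumptions; check the API reference for the exact values.

```python
import urllib.request

API_BASE = "https://api.glideapp.io"  # assumed base URL; verify in the API docs

def add_rows_tsv(token, table_id, tsv_body):
    # First TSV line is the column IDs; each later line is one new row.
    return urllib.request.Request(
        f"{API_BASE}/tables/{table_id}/rows",
        data=tsv_body.encode("utf-8"),
        method="POST",
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "text/tab-separated-values",
        },
    )

# urllib.request.urlopen(add_rows_tsv(token, table_id, body)) sends it.
```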

## Examples

<AccordionGroup>
2 changes: 2 additions & 0 deletions api-reference/v2/tables/post-tables.mdx
@@ -5,6 +5,8 @@ openapi: post /tables

Create a new Big Table, define its structure, and (optionally) populate it with data.

When using a CSV or TSV request body, the name of the table must be passed in the `name` query parameter, and the schema of the table is inferred from the content. Alternatively, the CSV/TSV content may be [stashed](/api-reference/v2/stashing/introduction), and the schema and name then passed in the regular JSON payload.
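A sketch of creating a table from a CSV body, with the table name in the query string. The base URL and Bearer auth are assumptions; confirm them against the API reference.

```python
import urllib.parse
import urllib.request

API_BASE = "https://api.glideapp.io"  # assumed base URL; verify in the API docs

def create_table_from_csv(token, name, csv_body):
    # With a text/csv body the table name goes in the `name` query
    # parameter; the schema is inferred from the CSV content.
    url = f"{API_BASE}/tables?" + urllib.parse.urlencode({"name": name})
    return urllib.request.Request(
        url,
        data=csv_body.encode("utf-8"),
        method="POST",
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "text/csv",
        },
    )

# urllib.request.urlopen(create_table_from_csv(token, "Invoices", body)) sends it.
```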

## Examples

<AccordionGroup>
2 changes: 2 additions & 0 deletions api-reference/v2/tables/put-tables.mdx
@@ -5,6 +5,8 @@ openapi: put /tables/{tableID}

Overwrite an existing Big Table by clearing all rows and adding new data. You can also update the table schema.

When using a CSV or TSV request body, you cannot pass a schema; the table's current schema is used. If you need to update the schema, use the `onSchemaError=updateSchema` query parameter, or [stash](/api-reference/v2/stashing/introduction) the CSV/TSV data and pass a JSON request body.
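The behavior above can be sketched as follows; the base URL and Bearer auth are assumptions, so verify them against the API reference.

```python
import urllib.parse
import urllib.request

API_BASE = "https://api.glideapp.io"  # assumed base URL; verify in the API docs

def overwrite_table_csv(token, table_id, csv_body, update_schema=False):
    # A CSV body reuses the table's current schema; passing
    # onSchemaError=updateSchema lets mismatched data update the
    # schema instead of failing.
    url = f"{API_BASE}/tables/{table_id}"
    if update_schema:
        url += "?" + urllib.parse.urlencode({"onSchemaError": "updateSchema"})
    return urllib.request.Request(
        url,
        data=csv_body.encode("utf-8"),
        method="PUT",
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "text/csv",
        },
    )
```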

<Warning>
This is a destructive operation that cannot be undone.
</Warning>
86 changes: 76 additions & 10 deletions openapi/swagger.json
@@ -215,8 +215,17 @@
}
}
},
"description": "Creates a new Big Table",
"parameters": [
{
"name": "name",
"in": "query",
"schema": {
"type": "string",
"description": "Name of the table. Required when the name is not passed in the request body. It is an error to pass a name in both this query parameter and the request body.",
"example": "Invoices"
},
"required": false
},
{
"name": "onSchemaError",
"in": "query",
@@ -386,9 +395,24 @@
],
"additionalProperties": false
}
},
"text/csv": {
"schema": {
"type": "string",
"description": "A CSV string. The first line is column IDs, and each subsequent line is a row of data. The schema will be inferred from the data. The name of the table must be passed in the query parameter `name`.",
"example": "Name,Age,Birthday\nAlice,25,2024-08-29T09:46:16.722Z\nBob,30,2020-01-15T09:00:16.722Z"
}
},
"text/tab-separated-values": {
"schema": {
"type": "string",
"description": "A TSV string. The first line is column IDs, and each subsequent line is a row of data. The schema will be inferred from the data. The name of the table must be passed in the query parameter `name`.",
"example": "Name\tAge\tBirthday\nAlice\t25\t2024-08-29T09:46:16.722Z\nBob\t30\t2020-01-15T09:00:16.722Z"
}
}
}
}
},
"description": "Creates a new Big Table"
}
},
"/tables/{tableID}": {
@@ -537,7 +561,6 @@
}
}
},
"description": "Overwrites a Big Table with new schema and/or data",
"parameters": [
{
"name": "tableID",
@@ -712,9 +735,24 @@
],
"additionalProperties": false
}
},
"text/csv": {
"schema": {
"type": "string",
"description": "A CSV string. The first line is column IDs, and each subsequent line is a row of data.",
"example": "Name,Age,Birthday\nAlice,25,2024-08-29T09:46:16.722Z\nBob,30,2020-01-15T09:00:16.722Z"
}
},
"text/tab-separated-values": {
"schema": {
"type": "string",
"description": "A TSV string. The first line is column IDs, and each subsequent line is a row of data.",
"example": "Name\tAge\tBirthday\nAlice\t25\t2024-08-29T09:46:16.722Z\nBob\t30\t2020-01-15T09:00:16.722Z"
}
}
}
}
},
"description": "Replaces the schema and/or data of a Big Table"
}
},
"/tables/{tableID}/rows": {
@@ -862,7 +900,6 @@
}
}
},
"description": "Adds rows to a Big Table",
"parameters": [
{
"name": "tableID",
@@ -935,9 +972,24 @@
}
]
}
},
"text/csv": {
"schema": {
"type": "string",
"description": "A CSV string. The first line is column IDs, and each subsequent line is a row of data.",
"example": "Name,Age,Birthday\nAlice,25,2024-08-29T09:46:16.722Z\nBob,30,2020-01-15T09:00:16.722Z"
}
},
"text/tab-separated-values": {
"schema": {
"type": "string",
"description": "A TSV string. The first line is column IDs, and each subsequent line is a row of data.",
"example": "Name\tAge\tBirthday\nAlice\t25\t2024-08-29T09:46:16.722Z\nBob\t30\t2020-01-15T09:00:16.722Z"
}
}
}
}
},
"description": "Adds rows to a Big Table"
}
},
"/stashes/{stashID}/{serial}": {
@@ -989,7 +1041,6 @@
}
}
},
"description": "Sets the content of a chunk of data inside a stash",
"parameters": [
{
"name": "stashID",
@@ -1039,9 +1090,24 @@
}
]
}
},
"text/csv": {
"schema": {
"type": "string",
"description": "A CSV string. The first line is column IDs, and each subsequent line is a row of data.",
"example": "Name,Age,Birthday\nAlice,25,2024-08-29T09:46:16.722Z\nBob,30,2020-01-15T09:00:16.722Z"
}
},
"text/tab-separated-values": {
"schema": {
"type": "string",
"description": "A TSV string. The first line is column IDs, and each subsequent line is a row of data.",
"example": "Name\tAge\tBirthday\nAlice\t25\t2024-08-29T09:46:16.722Z\nBob\t30\t2020-01-15T09:00:16.722Z"
}
}
}
}
},
"description": "Sets the content of a chunk of data inside a stash"
}
},
"/stashes/{stashID}": {
@@ -1093,7 +1159,6 @@
}
}
},
"description": "Deletes a stash and all its data",
"parameters": [
{
"name": "stashID",
@@ -1106,7 +1171,8 @@
},
"required": true
}
]
],
"description": "Deletes a stash and all its data"
}
}
},