🌿 Fern Regeneration -- September 13, 2024 (#576)
* SDK regeneration

* Fixes

* Type fixes

* images=

* image

---------

Co-authored-by: fern-api <115122769+fern-api[bot]@users.noreply.github.com>
Co-authored-by: Billy Trend <[email protected]>
fern-api[bot] and billytrend-cohere authored Sep 13, 2024
1 parent 6ceb338 commit d85de98
Showing 27 changed files with 514 additions and 302 deletions.
307 changes: 154 additions & 153 deletions poetry.lock

Large diffs are not rendered by default.

2 changes: 1 addition & 1 deletion pyproject.toml
@@ -1,6 +1,6 @@
[tool.poetry]
name = "cohere"
version = "5.9.1"
version = "5.9.2"
description = ""
readme = "README.md"
authors = []
96 changes: 83 additions & 13 deletions reference.md
@@ -135,6 +135,14 @@ Text input for the model to respond to.
Compatible Deployments: Cohere Platform, Azure, AWS Sagemaker/Bedrock, Private Deployments


</dd>
</dl>

<dl>
<dd>

**accepts:** `typing.Optional[typing.Literal["text/event-stream"]]` — Pass text/event-stream to receive the streamed response as server-sent events. The default is `\n` delimited events.

</dd>
</dl>
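
The new `accepts` parameter controls how streamed responses are delivered. Below is a minimal sketch, assuming `accepts` is exposed as a keyword argument on `chat_stream` exactly as documented above; the client setup mirrors the reference examples, and the event handling follows the v1 streaming interface:

```python
from cohere import Client

client = Client(
    client_name="YOUR_CLIENT_NAME",
    token="YOUR_TOKEN",
)

# Ask for server-sent events instead of the default newline-delimited events.
stream = client.chat_stream(
    message="Can you give me a global market overview of solar panels?",
    accepts="text/event-stream",
)
for event in stream:
    # Text chunks arrive as "text-generation" events.
    if event.event_type == "text-generation":
        print(event.text, end="")
```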

@@ -578,14 +586,15 @@ To learn how to use the Chat API with Streaming and RAG follow our [Text Generat
<dd>

```python
from cohere import Client
from cohere import Client, Message_Tool

client = Client(
client_name="YOUR_CLIENT_NAME",
token="YOUR_TOKEN",
)
client.chat(
message="Can you give me a global market overview of solar panels?",
chat_history=[Message_Tool(), Message_Tool()],
prompt_truncation="OFF",
temperature=0.3,
)
@@ -611,6 +620,14 @@ Text input for the model to respond to.
Compatible Deployments: Cohere Platform, Azure, AWS Sagemaker/Bedrock, Private Deployments


</dd>
</dl>

<dl>
<dd>

**accepts:** `typing.Optional[typing.Literal["text/event-stream"]]` — Pass text/event-stream to receive the streamed response as server-sent events. The default is `\n` delimited events.

</dd>
</dl>

@@ -1592,14 +1609,7 @@ client = Client(
client_name="YOUR_CLIENT_NAME",
token="YOUR_TOKEN",
)
client.embed(
texts=["string"],
images=["string"],
model="string",
input_type="search_document",
embedding_types=["float"],
truncate="NONE",
)
client.embed()

```
</dd>
@@ -1615,7 +1625,19 @@ client.embed(
<dl>
<dd>

**texts:** `typing.Sequence[str]` — An array of strings for the model to embed. Maximum number of texts per call is `96`. We recommend reducing the length of each text to be under `512` tokens for optimal quality.
**texts:** `typing.Optional[typing.Sequence[str]]` — An array of strings for the model to embed. Maximum number of texts per call is `96`. We recommend reducing the length of each text to be under `512` tokens for optimal quality.

</dd>
</dl>

<dl>
<dd>

**images:** `typing.Optional[typing.Sequence[str]]`

An array of image data URIs for the model to embed. Maximum number of images per call is `1`.

The image must be a valid [data URI](https://developer.mozilla.org/en-US/docs/Web/URI/Schemes/data). The image must be in either `image/jpeg` or `image/png` format and has a maximum size of 5MB.

</dd>
</dl>
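
With `texts` now optional and `images` added, `embed` can be called with a single image data URI instead of text. A hedged sketch follows; the model name and the `input_type` value are assumptions for illustration, not taken from this diff:

```python
import base64

from cohere import Client

client = Client(
    client_name="YOUR_CLIENT_NAME",
    token="YOUR_TOKEN",
)

# Build a data URI for a local PNG (the API accepts image/png or image/jpeg, max 5MB).
with open("chart.png", "rb") as f:
    data_uri = "data:image/png;base64," + base64.b64encode(f.read()).decode("utf-8")

response = client.embed(
    images=[data_uri],            # at most one image per call
    model="embed-english-v3.0",   # assumed model name; use an embed model that supports images
    input_type="image",           # assumed value for image inputs
    embedding_types=["float"],
)
```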
@@ -2312,8 +2334,13 @@ Generates a message from the model in response to a provided conversation. To le
<dd>

```python
from cohere import Client, ResponseFormat2_Text
from cohere.v2 import ChatMessage2_User, Tool2, Tool2Function
from cohere import Client
from cohere.v2 import (
ChatMessage2_User,
ResponseFormat2_Text,
Tool2,
Tool2Function,
)

client = Client(
client_name="YOUR_CLIENT_NAME",
@@ -2338,6 +2365,7 @@ response = client.v2.chat_stream(
],
citation_mode="FAST",
response_format=ResponseFormat2_Text(),
safety_mode="CONTEXTUAL",
max_tokens=1,
stop_sequences=["string"],
temperature=1.1,
@@ -2408,6 +2436,24 @@ Dictates the approach taken to generating citations as part of the RAG flow by a

**response_format:** `typing.Optional[ResponseFormat2]`

</dd>
</dl>

<dl>
<dd>

**safety_mode:** `typing.Optional[V2ChatStreamRequestSafetyMode]`

Used to select the [safety instruction](/docs/safety-modes) inserted into the prompt. Defaults to `CONTEXTUAL`.
When `NONE` is specified, the safety instruction will be omitted.

Safety modes are not yet configurable in combination with `tools`, `tool_results` and `documents` parameters.

**Note**: This parameter is only compatible with models [Command R 08-2024](/docs/command-r#august-2024-release), [Command R+ 08-2024](/docs/command-r-plus#august-2024-release) and newer.

Compatible Deployments: Cohere Platform, Azure, AWS Sagemaker/Bedrock, Private Deployments


</dd>
</dl>
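
A minimal sketch of setting `safety_mode` on the streaming v2 endpoint. The model name is an assumption chosen to satisfy the 08-2024 requirement noted above, and the message construction follows the `ChatMessage2_User` import used in the surrounding example:

```python
from cohere import Client
from cohere.v2 import ChatMessage2_User

client = Client(
    client_name="YOUR_CLIENT_NAME",
    token="YOUR_TOKEN",
)

# Pass "NONE" to omit the safety instruction; leaving safety_mode unset keeps the CONTEXTUAL default.
response = client.v2.chat_stream(
    model="command-r-08-2024",  # assumed model; safety_mode requires an 08-2024 or newer model
    messages=[ChatMessage2_User(content="Draft a short safety notice for a chemistry lab.")],
    safety_mode="NONE",
)
for event in response:
    print(event)
```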

@@ -2557,14 +2603,20 @@ Generates a message from the model in response to a provided conversation. To le

```python
from cohere import Client
from cohere.v2 import ChatMessage2_Tool

client = Client(
client_name="YOUR_CLIENT_NAME",
token="YOUR_TOKEN",
)
client.v2.chat(
model="model",
messages=[],
messages=[
ChatMessage2_Tool(
tool_call_id="messages",
tool_content=["messages"],
)
],
)

```
@@ -2624,6 +2676,24 @@ Dictates the approach taken to generating citations as part of the RAG flow by a

**response_format:** `typing.Optional[ResponseFormat2]`

</dd>
</dl>

<dl>
<dd>

**safety_mode:** `typing.Optional[V2ChatRequestSafetyMode]`

Used to select the [safety instruction](/docs/safety-modes) inserted into the prompt. Defaults to `CONTEXTUAL`.
When `NONE` is specified, the safety instruction will be omitted.

Safety modes are not yet configurable in combination with `tools`, `tool_results` and `documents` parameters.

**Note**: This parameter is only compatible with models [Command R 08-2024](/docs/command-r#august-2024-release), [Command R+ 08-2024](/docs/command-r-plus#august-2024-release) and newer.

Compatible Deployments: Cohere Platform, Azure, AWS Sagemaker/Bedrock, Private Deployments


</dd>
</dl>
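
The same parameter applies to the non-streaming `v2.chat` call. A brief sketch with an assumed model name and an assumed `STRICT` value (the documented default is `CONTEXTUAL`):

```python
from cohere import Client
from cohere.v2 import ChatMessage2_User

client = Client(
    client_name="YOUR_CLIENT_NAME",
    token="YOUR_TOKEN",
)

response = client.v2.chat(
    model="command-r-plus-08-2024",  # assumed model meeting the 08-2024 requirement
    messages=[ChatMessage2_User(content="Summarize our deployment options.")],
    safety_mode="STRICT",  # assumed enum value; "CONTEXTUAL" and "NONE" are documented above
)
print(response)
```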

32 changes: 17 additions & 15 deletions src/cohere/__init__.py
@@ -33,10 +33,6 @@
ChatToolCallsChunkEvent,
ChatToolCallsGenerationEvent,
CheckApiKeyResponse,
CitationEndEvent,
CitationStartEvent,
CitationStartEventDelta,
CitationStartEventDeltaMessage,
ClassifyDataMetrics,
ClassifyExample,
ClassifyRequestTruncate,
@@ -91,7 +87,6 @@
GetConnectorResponse,
GetModelResponse,
JsonResponseFormat,
JsonResponseFormat2,
LabelMetric,
ListConnectorsResponse,
ListEmbedJobResponse,
@@ -115,9 +110,6 @@
RerankResponseResultsItemDocument,
RerankerDataMetrics,
ResponseFormat,
ResponseFormat2,
ResponseFormat2_JsonObject,
ResponseFormat2_Text,
ResponseFormat_JsonObject,
ResponseFormat_Text,
SingleGeneration,
@@ -163,9 +155,9 @@
)
from . import connectors, datasets, embed_jobs, finetuning, models, v2
from .aws_client import AwsClient
from .client_v2 import AsyncClientV2, ClientV2
from .bedrock_client import BedrockClient
from .client import AsyncClient, Client
from .client_v2 import AsyncClientV2, ClientV2
from .datasets import (
DatasetsCreateResponse,
DatasetsCreateResponseDatasetPartsItem,
@@ -218,10 +210,18 @@
ChatToolPlanDeltaEvent,
ChatToolPlanDeltaEventDelta,
Citation,
CitationEndEvent,
CitationStartEvent,
CitationStartEventDelta,
CitationStartEventDeltaMessage,
Content,
Content_Text,
DocumentSource,
JsonResponseFormat2,
NonStreamedChatResponse2,
ResponseFormat2,
ResponseFormat2_JsonObject,
ResponseFormat2_Text,
Source,
Source_Document,
Source_Tool,
@@ -242,22 +242,22 @@
SystemMessageContentItem,
SystemMessageContentItem_Text,
TextContent,
TextResponseFormat2,
Tool2,
Tool2Function,
ToolCall2,
ToolCall2Function,
ToolContent,
ToolMessage2,
ToolMessage2ToolContentItem,
ToolMessage2ToolContentItem_ToolResultObject,
ToolSource,
Usage,
UsageBilledUnits,
UsageTokens,
UserMessage,
UserMessageContent,
V2ChatRequestCitationMode,
V2ChatRequestSafetyMode,
V2ChatStreamRequestCitationMode,
V2ChatStreamRequestSafetyMode,
)
from .version import __version__

@@ -274,6 +274,7 @@
"AssistantMessageResponseContentItem",
"AssistantMessageResponseContentItem_Text",
"AsyncClient",
"AsyncClientV2",
"AuthTokenType",
"AwsClient",
"BadRequestError",
@@ -354,6 +355,7 @@
"ClientClosedRequestError",
"ClientClosedRequestErrorBody",
"ClientEnvironment",
"ClientV2",
"CompatibleEndpoint",
"Connector",
"ConnectorAuthStatus",
@@ -483,6 +485,7 @@
"SystemMessageContentItem_Text",
"TextContent",
"TextResponseFormat",
"TextResponseFormat2",
"TokenizeResponse",
"TooManyRequestsError",
"TooManyRequestsErrorBody",
@@ -493,11 +496,8 @@
"ToolCall2",
"ToolCall2Function",
"ToolCallDelta",
"ToolContent",
"ToolMessage",
"ToolMessage2",
"ToolMessage2ToolContentItem",
"ToolMessage2ToolContentItem_ToolResultObject",
"ToolParameterDefinitionsValue",
"ToolResult",
"ToolSource",
@@ -511,7 +511,9 @@
"UserMessage",
"UserMessageContent",
"V2ChatRequestCitationMode",
"V2ChatRequestSafetyMode",
"V2ChatStreamRequestCitationMode",
"V2ChatStreamRequestSafetyMode",
"__version__",
"connectors",
"datasets",