diff --git a/content/integrations/inbound/kafka-connector.textile b/content/integrations/inbound/kafka-connector.textile index a3063306b3..6f6ed1ad7e 100644 --- a/content/integrations/inbound/kafka-connector.textile +++ b/content/integrations/inbound/kafka-connector.textile @@ -1,6 +1,6 @@ --- title: Ably Kafka Connector -meta_description: "The Ably Kafka Connector enables you to send data from Kafka to an Ably channel." +meta_description: "The Ably Kafka Connector sends data from Kafka to an Ably channel in realtime." meta_keywords: "Kafka, Kafka Connector, channel" languages: - none @@ -8,69 +8,157 @@ redirect_from: - /docs/general/kafka-connector --- -The Ably Kafka Connector provides a ready-made integration between "Kafka":https://kafka.apache.org/ and Ably. It allows for realtime event distribution from Kafka to web, mobile, and IoT clients, over Ably's feature-rich, multi-protocol pub/sub channels. If instead you need to send data from Ably to Kafka, use a "Kafka rule":/docs/integrations/streaming/kafka. +The Ably Kafka Connector integrates "Kafka":https://kafka.apache.org/ with Ably for realtime event distribution to web, mobile, and IoT clients via Ably's pub/sub "channels":/docs/channels. -The Ably Kafka Connector is verified by Confluent as "Gold":https://www.confluent.io/hub/ably/kafka-connect-ably, following the guidelines set forth by Confluent's "Verified Integrations Program":https://www.confluent.io/verified-integrations-program/. +To send data from Ably to Kafka, use a "Kafka rule":/docs/integrations/streaming/kafka. -You can use the Ably Kafka Connector to send data from one or more "Kafka topics":https://developer.confluent.io/learn-kafka/apache-kafka/topics/ into Ably "channels":/docs/channels. 
The following diagram illustrates a typical deployment:
+This connector is Confluent "Gold":https://www.confluent.io/hub/ably/kafka-connect-ably Verified, in line with Confluent's "Verified Integrations Program":https://www.confluent.io/partners/connect/. You can use it to send data from one or more "Kafka topics":https://developer.confluent.io/learn-kafka/apache-kafka/topics/ to one or more Ably channels.

Ably Kafka Connector overview

-h2(#how). How does the Ably Kafka Connector work?
+The Ably Kafka Connector is a sink connector built on top of Kafka "Connect":https://docs.confluent.io/platform/current/connect/index.html#how-kafka-connect-works. You can deploy it in two ways:
+* Self-hosted - Run within your own infrastructure.
+* Third-party hosted - Available on the Confluent Platform.

-The Ably Kafka Connector is a sink connector built on top of Kafka Connect. It can be self-hosted or hosted with a third-party provider, the most common being the Confluent Platform. You can download it from either "GitHub":https://github.com/ably/kafka-connect-ably or "Confluent Hub":https://www.confluent.io/hub/ably/kafka-connect-ably and install it into your Kafka Connect workers.
+Install the Ably Kafka Connector from:
+* "GitHub":https://github.com/ably/kafka-connect-ably
+* "Confluent Hub":https://www.confluent.io/hub/ably/kafka-connect-ably

-The Ably Kafka Connector provides a ready-made integration between Kafka and Ably via your Ably API key. Once installed, you can configure the connector with your Ably API key to enable data from one or more Kafka topics to be published into a single Ably channel or multiple Ably channels. Events are then distributed in realtime to web, mobile, and IoT clients over feature-rich, multi-protocol pub/sub Ably channels optimized for last-mile delivery.
+Once installed, configure it with your "Ably API key":/docs/auth#api-keys.
This allows Kafka topics to publish data into Ably channels, making messages available in realtime for web, mobile, and IoT clients.

-h2(#mapping-options). Mapping options
+h2(#mapping). Mapping

-The Ably Kafka Connector supports two mapping options:
+The Ably Kafka Connector supports two mapping methods:
+* "Static":#static - Assigns messages to a fixed Ably channel.
+* "Pattern-based":#pattern - Dynamically assigns messages based on the topic or record key.

-* Static
-* Pattern-based

+h3(#static). Static mapping

-h3(#static-mapping). Static mapping

+Static mapping assigns one or more Kafka topics to a single Ably channel. The channel name stays the same, regardless of the Kafka record.

-Static mapping enables the connection of one or multiple Kafka topics to a single Ably channel. Here you provide a string for a configuration, such as for channel name. This stays the same no matter what record the connector is consuming. For example if you configure a channel with @channel="global"@, all Kafka records will be published through that channel. This is useful if, for example, you are a sports site broadcasting updates to millions of subscribers through a single channel. The following configuration provides an example of this:
+For example, a sports website streaming live updates can set @channel = basketball@, so that all Kafka records, regardless of their Kafka topic, are published to the basketball channel. The following example maps all Kafka topics to the basketball channel:

-```
+```[text]
channel = basketball
message.name = news_update
```

-h3(#pattern-mapping). Pattern-based mapping

-Pattern-based mapping gives you the ability to map many Kafka topics to many Ably channels following set rules. Pattern-based mapping enables each message, a Kafka record, to be checked to determine the recipient, and then determine which channel to send it through.
Pattern-based mapping provides customers with the ability to adjust their configuration by interpolating across record key and record topic values. For example, where the channel name is configured as @channel="channel_#{topic}"@, the message will be published to a channel according to the topic name. There are two configurations that support pattern-based mapping: @channel@ and @message.name@.
+Pattern-based mapping dynamically maps multiple Kafka topics to different Ably channels by interpolating the record's topic and key into the two configurations that support it: @channel@ and @message.name@. For example, setting @channel = channel_#{topic}@ publishes messages from an @events@ topic to the @channel_events@ channel, and setting @message.name = message_#{key}@ gives records with a @remove@ key the message name @message_remove@.

+The following configuration maps Kafka topics to Ably channels based on the topic name:
+
-```
+```[text]
channel = channel_#{topic}
message.name = message_#{key}
```

-For an @events@ topic, the channel would translate as @channel_events@. For a @remove@ key, the @message.name@ would be configured as @message_remove@.

h3(#mixed). Mixed mapping

-It is important to note that you can also use static mapping in conjunction with pattern-based mapping. For example, you could define the channel as @channel_#{topic}@, but @message.name@ as @single_message@. In this instance, the channel will be mapped by pattern and the message name will be mapped statically, as shown in the following configuration:
+Mixed mapping combines static and pattern-based mapping. For example, you can map the channel dynamically while keeping the message name static, so that messages are routed to topic-specific channels but share a consistent message name:

-```
+```[text]
channel = channel_#{topic}
message.name = single_message
```

-h2(#schema). Publishing messages with a schema
+h2(#publish). 
Publish messages with a schema
+
+The Ably Kafka Connector supports messages that include schema information. It converts these messages to JSON before publishing them to Ably. You can also use the "Kafka Schema Registry":https://docs.confluent.io/platform/current/schema-registry/connect.html and supported converters.
+
+h3(#configure). Configure a schema conversion
+
+If messages on the Kafka topic are serialized using "Avro":https://developer.confluent.io/confluent-tutorials/kafka-producer-application/kafka/#produce-avro-records, and schemas are registered in a Schema Registry, configure the connector to convert Avro to JSON.
+
+Set the following properties in your Kafka Connect configuration:
+
+```[text]
+value.converter=io.confluent.connect.avro.AvroConverter
+value.converter.schema.registry.url=https://
+```
+
+This configuration ensures Kafka messages are correctly deserialized and transformed before reaching Ably.
+
+h3(#test). Test Avro messages with Docker Compose
+
+If you're running the Ably Kafka Connector locally using Docker Compose, you can test Avro-serialized messages using the @kafka-avro-console-producer@ CLI.
+
+Run the following command:
+

+```[sh]
+docker-compose exec -T schema-registry kafka-avro-console-producer \
+   --topic topic1 \
+   --broker-list kafka:9092 \
+   --property key.schema='{"type":"string"}' \
+   --property parse.key=true \
+   --property key.separator=":" \
+   --property value.schema='{"type":"record","name":"myrecord","fields":[{"name":"count","type":"int"}]}' \
+   --property schema.registry.url=http://schema-registry:8081
+```
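Once the producer is waiting for input, you can type a record that matches the key and value schemas above, using the configured @:@ separator between key and value, for example:

```[text]
"key1":{"count":1}
```

Press Enter to send the record; the connector then publishes it to the mapped Ably channel.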
+
+This command produces a Kafka record with an Avro-encoded key and value. It relies on the Kafka Schema Registry to validate the schemas before sending the record to the Kafka topic named @topic1@.
+
+When you subscribe to the Ably channel, you should receive the following message:
+
+```[json]
+{
+  "clientId": "Ably-Kafka-Connector",
+  "connectionId": "VSuDXysgaz",
+  "data": {
+    "count": 1
+  },
+  "extras": {
+    "kafka": {
+      "key": "AAAAAKEIa2V5MQ=="
+    }
+  },
+  "id": "-868034334:0:351",
+  "name": "topic1_message",
+  "timestamp": 1653923422360
+}
+```
+
+The following explains how Ably processes the Kafka message structure:
+
+* The Avro-formatted key appears in base64 encoding under the @extras.kafka.key@ field.
+* The actual message data, which contains @count: 1@, is extracted and formatted as JSON.
+* The @name@ field, @topic1_message@, corresponds directly to the Kafka topic.
+
+h2(#configuration). Configure the Kafka connector

-The Ably Kafka Connector supports "messages that contain schema information":https://github.com/ably/kafka-connect-ably#publishing-messages-with-schema by converting them to JSON before publishing them to Ably. To learn how to use schema registry and supported converters, see "Using Kafka Connect with Schema Registry":https://docs.confluent.io/platform/current/schema-registry/connect.html.
+The Ably Kafka Connector sends Kafka messages to Ably channels in realtime. You can configure it to define:

-h2(#install). Install the connector
+* Kafka topics to pull data from.
+* Ably channels to send data to.

-The Ably Kafka Connector is available to install from "GitHub":https://github.com/ably/kafka-connect-ably or "Confluent Hub":https://www.confluent.io/hub/ably/kafka-connect-ably.

+|_. Installation Method |_. Configuration Steps |
+| @Docker@ | Create a @docker-compose-connector.properties@ file inside the @/config@ directory.
An example file is already available in the repository. |
+| @Single connect worker@ | Provide a configuration file as a command-line argument when running the worker. |
+| @Distributed connect workers@ | Use the Confluent REST API @/connectors@ endpoint and pass the configuration as JSON. |

-View detailed instructions for installation on "GitHub":https://github.com/ably/kafka-connect-ably#install.

+h3(#properties). Connector configuration properties

-h2(#configure). Configure the connector

+You must configure these core properties to get the connector working.

-Configure the connector to set "properties":https://github.com/ably/kafka-connect-ably#configuration such as which Kafka topics to send data from, and which Ably channel should receive the data.

+|_. Property |_. Description |
+| @channel@ | The Ably channel to which messages are published. Supports "pattern-based mapping":#pattern. |
+| @client.key@ | An Ably API key used for authentication. Must have *publish* capability for the specified channel. |
+| @client.id@ | The Ably client ID the connector uses. Defaults to @"kafka-connect-ably-example"@. |
+| @name@ | A globally unique name for the connector. Defaults to @"ably-channel-sink"@. |
+| @topics@ | A comma-separated list of Kafka topics to publish from. |
+| @tasks.max@ | The maximum number of tasks the connector should run. Defaults to @1@. |
+| @connector.class@ | The class name for the connector. Must be a subclass of @org.apache.kafka.connect.connector.Connector@. Defaults to @io.ably.kafka.connect.ChannelSinkConnector@. |

-Note that the configuration method differs depending on whether you are running a single or distributed set of "connect workers":https://docs.confluent.io/home/connect/self-managed/userguide.html#configuring-and-running-workers.
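Bringing the core properties together, a minimal sink configuration might look like the following sketch. The topic name and the API key placeholder are illustrative; replace them with your own values:

```[text]
connector.class = io.ably.kafka.connect.ChannelSinkConnector
tasks.max = 1
topics = topic1
client.key = <an Ably API key with publish capability>
client.id = kafka-connect-ably-example
channel = channel_#{topic}
message.name = message_#{key}
```

With this configuration, records from @topic1@ are published to the @channel_topic1@ channel, and each message is named after the record's key.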
+
diff --git a/content/integrations/inbound/webhooks.textile b/content/integrations/inbound/webhooks.textile
index 89a1e1be59..3a6b707127 100644
--- a/content/integrations/inbound/webhooks.textile
+++ b/content/integrations/inbound/webhooks.textile
@@ -1,78 +1,87 @@
---
title: Inbound Webhooks
-meta_description: "Incoming webhooks provide a way to integrate external web services with Ably."
-meta_keywords: "Ably, incoming, inbound, webhooks, inbound webhooks, incoming webhooks, webhook configuration, web services, realtime"
+meta_description: "Incoming webhooks let you integrate external web services with Ably."
+meta_keywords: "Ably, incoming, inbound, webhooks, webhook configuration, web services, realtime"
languages:
- nodejs
redirect_from:
- /docs/general/incoming-webhooks
---
-While it is possible for external services to publish messages to Ably channels using the "REST API":/docs/api/rest-api#unenveloped and your existing API key and secret, an alternative method is to use incoming webhooks. Incoming webhooks provide a convenient way to configure Ably to integrate with external web services. The incoming webhooks are published to Ably channels as "unenveloped messages":/docs/api/rest-api#unenveloped.
+External services can publish messages to Ably channels using the "REST API":/docs/api/rest-api and an "API key":/docs/account/app/api#create. However, a simpler alternative is "incoming webhooks":#configure.

-Many web services generate webhooks as a way of communicating with your web application. These web services often generate webhooks based on interaction with their APIs and infrastructure. To allow Ably to receive these webhooks you need to configure the external web service with a webhook URL. The webhook URL specifies the endpoint for the generated webhook requests, and is defined from within the Ably "dashboard":https://ably.com/dashboard.
+Incoming webhooks behave like REST publishes: when a webhook request is received, its payload is published to an Ably channel as an "unenveloped":/docs/api/rest-api#unenveloped message.

-You can configure the external web service with an Ably URL and webhook requests are invoked on that endpoint. The webhook data is published to the configured Ably channel as a message.
+Many web services generate webhooks to communicate with applications. These webhooks trigger based on interactions with their APIs or infrastructure.

-Ably also supports "outbound webhooks":/docs/integrations/webhooks. These are webhooks that can be generated by Ably to integrate Ably with other services such as "Zapier":/docs/integrations/webhooks/zapier.
+To receive webhooks using Ably:
+# Set up a webhook URL in the "Ably dashboard":https://ably.com/dashboard.
+# Configure the external service to send webhooks to this URL.
+# Ably publishes the webhook data as a message to the configured "channel":/docs/channels.

-Ably's incoming webhooks support provides a convenient way to configure external services to work with Ably. The incoming webhooks are published as messages on the configured Ably channel, where they can be processed by your application as required.
+Ably also supports "outbound webhooks":/docs/integrations/webhooks, which send data from Ably to other services like "Zapier":/docs/integrations/webhooks/zapier.

-h2(#configure). Configuring your incoming webhook
+

h2(#configure). Configure an incoming webhook

-You configure your incoming webhooks in the @Integrations@ tab of the Ably "dashboard":https://ably.com/dashboard#58:

-1. In the @Incoming Webhooks Setup@ section, click the @Register a new webhook endpoint@ button.
-2. Give your webhook a friendly name.
-3. 
Specify a new or existing Ably channel which is going to receive the incoming webhooks.
-4. Click the @Generate a URL@ button.
-5. Copy the generated webhook URL to the clipboard, and use it to configure your service as required.
+Set up incoming webhooks in the *Integrations* tab of the "Ably dashboard":https://ably.com/dashboard:
+# Go to the *Integrations* tab and find the *Incoming Webhooks Setup* section.
+# Click *Register a new webhook endpoint*.
+# Name your webhook.
+# Select an Ably channel to receive webhook messages.
+# Click *Generate a URL*.
+# Copy the generated URL and configure your external service with it.

-You can optionally test your webhook is correctly configured using the Curl requests provided in the "dashboard":https://ably.com/dashboard.

+You can test the webhook setup using curl requests from the "dashboard":https://ably.com/dashboard.

-h2(#test). Test your incoming webhook
+h3(#test). Test an incoming webhook

-You can now optionally test your incoming webhook with a Curl command such as the following:
+Run the following curl command to simulate an incoming webhook request, replicating how an external service would send a webhook:

```[sh]
curl -X POST 'https://rest.ably.io/channels/webhook-test/messages?key={{API_KEY_NAME}}:{{API_KEY_SECRET}}&enveloped=false' \
  -H 'content-type: application/json' --data '{"some":"json"}'
```

-This Curl request is equivalent to how an external service would publish a webhook. Note that incoming webhooks are simply REST publishes, and so they reflect the functionality of the "REST publish API":/docs/channels/messages/.
+Incoming webhooks function as REST publishes, and so follow the same behavior as the "REST publish API":/docs/api/rest-api#common.
-Ably responds with the @channel@ and @messageId@ of the message that results from the inbound webhook:
+Ably responds with the @channel@ and @messageId@:

```[json]
{
  "channel": "webhook-test",
  "messageId": "4RcVBJsAbC:0"
}
```

-A successful publish responds with a 201 @statusCode@; failure results in a 4xx or 5xx code and an @ErrorInfo@ response body.
+A successful request returns a @201@ status code. Failures return a @4xx@ or @5xx@ status code and an "@ErrorInfo@":/docs/api/rest-sdk/types#error-info response body.

-h2(#receive). Receiving the incoming webhook
+h2(#receive). Receive webhook messages

-You can receive the incoming webhook as you would a normal Ably message on a channel. For example, if the incoming webhook data is in JSON format, you can display the data using the following code:
+Incoming webhooks publish messages to an Ably channel. You can "subscribe":/docs/pub-sub#subscribe to these messages using an Ably SDK:

-```[nodejs]
-var ably = new require("ably").Realtime('{{API_KEY_NAME}}:{{API_KEY_SECRET}}');
-
-/* Obtain the channel that has been configured to receive webhooks */
-var channel = ably.channels.get('webhook-test');
-
-/* Incoming webhook messages can be displayed using this function */
-channel.subscribe((message) => {
-  console.log("webhook: " + JSON.stringify(message.data));
-});
-```
+```[nodejs]
+const Ably = require("ably");
+
+const ably = new Ably.Realtime('{{API_KEY}}');
+
+// The channel configured to receive webhooks
+const channel = ably.channels.get('webhook-test');
+
+channel.subscribe((message) => {
+  console.log(`Webhook received: ${JSON.stringify(message.data)}`);
+});
+```

-h2(#headers). Specifying optional headers
+If the webhook sends JSON data, it is logged to the console.
+
+h2(#headers). Optional headers
+The request body of incoming webhooks is treated as a message to be published. If the external service allows, you can customize webhook requests by including optional "headers and parameters":/docs/api/rest-api#unenveloped.

-The following example shows how the message @name@ could be set via a request header:
+The following example demonstrates how to set a message name using the @X-Ably-Name@ header:

```[sh]
curl -X POST 'https://rest.ably.io/channels/webhook-test/messages?key=key:secret&enveloped=false' \
@@ -80,7 +89,7 @@ curl -X POST 'https://rest.ably.io/channels/webhook-test/messages?key=key:secret
 -H 'X-Ably-Name: webhook-message'
```

-Your code to receive the named message is:
+Then, filter messages by name:

```[nodejs]
channel.subscribe('webhook-message', (message) => {
@@ -88,9 +97,4 @@ channel.subscribe('webhook-message', (message) => {
});
```

-"Idempotent publishing":https://faqs.ably.com/what-is-idempotent-publishing is possible by including a unique message identifier in an @X-Ably-MessageId@ header.
-
-h2(#rate-limiting). Rate limiting
-
-"Rate-limiting":https://faqs.ably.com/do-you-have-any-connection-message-rate-or-other-limits-on-accounts is applied to webhook message publishes, as with REST publishes.
-
+To enable "idempotent publishing":/docs/pub-sub/advanced#idempotency, include a unique message identifier in an @X-Ably-MessageId@ header.
diff --git a/content/integrations/webhooks/index.textile b/content/integrations/webhooks/index.textile
index a65b162cf3..7a0ab331a9 100644
--- a/content/integrations/webhooks/index.textile
+++ b/content/integrations/webhooks/index.textile
@@ -19,44 +19,43 @@
redirect_from:
- /docs/general/webhooks
---
-Outbound Webhooks allow you to configure integration rules that react to "messages being published":/docs/pub-sub or "presence events emitted":/docs/presence-occupancy/presence (such as members entering or leaving) on "channels":/docs/channels.
These rules can notify HTTP endpoints, serverless functions or other services for each event as they arise, or in batches. +Outbound Webhooks allow you to send realtime notifications to HTTP endpoints when events are triggered, for example when messages are "published":/docs/pub-sub#publish or "presence":/docs/presence-occupancy/presence#trigger-events events occur on Ably "channels":/docs/channels. Webhooks can be used to: -p(tip). Webhooks are rate limited and are suitable for low to medium volumes of updates. If you expect a high volume of events and messages (averaging more than 25 per second), then you should consider using our "message queues":/docs/integrations/queues or "firehose":/docs/general/firehose as they are more suitable for higher volumes. +* Trigger serverless functions. +* Notify third-party services. +* Deliver event data in batches or as individual messages. -Subscribing to messages on-demand is often best done using our "realtime client libraries":/docs/basics/use-ably or by subscribing to Ably using any of the "realtime protocols we support":https://ably.com/protocols. However, when a persistent subscription is required to push data into third party systems you can use webhooks (for HTTP requests, serverless functions, etc), "Queues":/docs/integrations/queues (data is pushed into our own hosted message queues that you can subscribe to), or "Firehose":/docs/general/firehose (stream events into third party systems such as Amazon Kinesis). - -If you want to be notified as events arise, trigger serverless functions, or invoke an HTTP request to an endpoint, then webhooks are the right choice. For example, if you want to send a welcome message to someone when they become present on a chat channel, you can use webhooks to trigger a serverless function immediately after they enter with using "channel lifecycles":#sources, which in turn can publish a welcome message back to that user on the chat channel. 
-
-In addition, various existing systems, such as Azure Functions, Google Functions, and AWS Lambda rely on HTTP events. Webhooks enable you to integrate with "these systems":#integrations.
+Webhooks are "rate-limited":/docs/pricing/limits#integration and best suited for low to medium traffic. If you need to process more than 25 events per second, consider using "Message Queues":/docs/integrations/queues or "Firehose":/docs/integrations/streaming instead.

Ably Webhooks Overview

-You can configure integration rules from the **Integrations** tab in your "dashboard":https://ably.com/dashboard on a per-app basis which can apply to one or more channels in that app.
-
-Integration rules can filter by channel naming using a regular expression, for example @^click_.*_mouse$@. This would match the string @click_@ followed by a string followed by @_mouse@, for example, @click_left_mouse@.
-
h2(#payloads). Webhook Payloads

-Webhook payloads are structured data sent to your configured webhook URL whenever specific events occur from "sources":#sources in your Ably application. These payloads provide detailed information about events, such as messages, presence updates, or channel lifecycle changes. Depending on your configuration, payloads can be batched, enveloped, or non-enveloped.
+Webhook payloads contain structured event data from your Ably app:
+
+# "Single request":#single - one request per event.
+# "Batched request":#batched - multiple events grouped in one request.
+# "Enveloped":#enveloped - event data wrapped with extra metadata.
+# "Non-enveloped":#non-enveloped - just the raw event data.

-h3(#sources). Sources

-Ably currently supports the following sources for all rule types, in both single and batch mode:
+h3(#sources). Sources of webhook events

+You can configure webhooks to listen for the following event types:

-- channel.message := If the source @channel.message@ is selected, you receive notifications when "messages":/docs/messages are published on a channel.
-- channel.presence := If the source @channel.presence@ is selected, you receive notifications of "presence events":/docs/presence-occupancy/presence when clients enter, update their data, or leave channels.
-- channel.lifecycle := If the source @channel.lifecycle@ is selected, you receive notifications of "channel lifecycle events":/docs/metadata-stats/metadata/subscribe#channel-lifecycle, such as when a channel is created (following the first client attaching to this channel) or discarded (when there are no more clients attached to the channel).
-- channel.occupancy := If the source @channel.occupancy@ is selected, you receive notifications of "occupancy events":/docs/presence-occupancy/occupancy, which relate to the number and type of occupants in the channel.
+- @channel.lifecycle@ := Triggered when a channel is created or discarded. Note that a channel is created in the Ably service when the first client attaches to it, and discarded when no clients remain attached.
+- @channel.message@ := Triggered when "messages":/docs/messages are published.
+- @channel.occupancy@ := Triggered when the number of users in a channel "changes":/docs/channels/options#subscribe-occupancy.
+- @channel.presence@ := Triggered when users enter, leave, or update their "presence":/docs/presence-occupancy/presence.

-Note that for scalability reasons, it is recommended that @channel.lifecycle@ and @channel.occupancy@ rules are used instead of @channel.message@ rules on corresponding "metachannels":/docs/metadata-stats/metadata/subscribe.
+For high-traffic channels, use "metachannels":/docs/metadata-stats/metadata/subscribe#metachannels with @channel.lifecycle@ and @channel.occupancy@ rules instead of @channel.message@ rules. This reduces event volume and improves scalability.

-h3(#batching). Single vs Batched requests
+h3(#vs). 
Single vs Batched requests

If *Single request* is chosen for a rule, a @POST@ request is sent to your specified endpoint/service each time an event occurs. This can be useful where the endpoint can only process one message per request, or needs each event as soon as it's available, but it can result in the endpoint being overloaded with requests. To avoid this, use *Batch request* instead, which batches messages sent within a set time frame into a single request.

-h4(#single-request). Single request details
+h4(#single). Single request details

Single request is best suited for scenarios where you want a 1-to-1 relationship between messages sent and requests made. If you are using a serverless system that expects a single piece of data each time, and then performs some transformation or event in response, Single request will likely work well for you. If you're using a single server, which has the potential to be overloaded by requests, and can process multiple events per payload, Batch request is a better choice.

@@ -70,7 +69,7 @@ h5(#single-rate-limits). Single request rate limits

* Webhook requests are made with a default timeout of 15s. If the request fails or times out, Ably retries the request with exponential backoff (base delay 1s, backoff factor sqrt(2), up to a max of 60s)
* Multiple requests can be in-flight at once, up to the "max concurrency limit":https://faqs.ably.com/do-you-have-any-connection-message-rate-or-other-limits-on-accounts. If the number of in-flight requests exceeds the max concurrency limit, new messages coming in are placed in a short queue (length 10); if that queue length is exceeded, further messages are rejected

-h4(#batch-request). Batch request details
+h4(#batched). 
Batch request details

Batch requests are useful for endpoints that could otherwise be overloaded by requests, or that have no requirement to process messages one-by-one. If your endpoint requires one event per request (for example, "IFTTT":/docs/integrations/webhooks/ifttt), you should use Single request.

@@ -99,7 +98,7 @@ The encoding of payloads sent is defined when setting up a rule in the "Integrat

* "JSON":https://www.json.org (JavaScript Object Notation): An efficient data-interchange format which is fairly standard and provides simple text based encoding.
* "MessagePack":https://msgpack.org: An efficient binary serialization format that is similar to JSON, but smaller. This is notably useful for binary payloads, as a JSON envelope with a binary payload would need to have the payload base64-encoded

-h3(#envelope). Envelopes
+h3(#enveloped). Enveloped

When you configure a rule using "single requests":#single, you are given the option to envelope messages, which is enabled by default. In most cases, an enveloped message provides more flexibility, as it contains additional metadata in a portable format, such as the @clientId@ of the publisher or the @channel@ name the message originated from.
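To make the difference concrete, the following sketch shows how a receiving endpoint might normalize the two shapes. The field names used here (@channel@ and a @messages@ array for enveloped payloads) are assumptions based on the enveloped metadata described above; check your rule's actual payloads before relying on them:

```[javascript]
// Sketch: normalize enveloped and non-enveloped webhook bodies into a
// single list of { channel, name, data } records.
// Assumed shapes (verify against your rule's real payloads):
//   enveloped:     { channel: "...", messages: [{ name, data, ... }, ...] }
//   non-enveloped: the raw message data itself
function extractMessages(body) {
  if (body && Array.isArray(body.messages)) {
    // Enveloped: metadata wrapper plus an array of messages
    return body.messages.map((m) => ({
      channel: body.channel,
      name: m.name,
      data: m.data,
    }));
  }
  // Non-enveloped: the body is the message payload; no metadata available
  return [{ channel: null, name: null, data: body }];
}
```

An enveloped payload yields one entry per message, each carrying the channel name, while a non-enveloped payload comes back as a single entry whose @data@ is the raw body.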