Full Changelog: https://github.com/slackapi/python-slack-sdk/compare/v3.40.1...v3.41.0 Milestone: https://github.com/slackapi/python-slack-sdk/milestone/117?closed=1
Full Changelog: https://github.com/slackapi/python-slack-sdk/compare/v3.40.0...v3.40.1 Milestone: https://github.com/slackapi/python-slack-sdk/milestone/116?closed=1
https://github.com/user-attachments/assets/85219072-66ac-4d86-a18b-dd92689036ea
https://github.com/user-attachments/assets/4cb0bb5f-c70f-4304-b875-4dc262640a67
Now, you can display a mixture of structured content called "chunks":
Available in:

- `chat_startStream`, `chat_appendStream`, and `chat_stopStream`
- `streamer = client.chat_stream(...)`, `streamer.append(...)`

```shell
$ slack create
# → AI Agent App
#   Bolt for JavaScript
#   Bolt for Python
```
```python
import time

streamer = client.chat_stream(
    channel=channel_id,
    recipient_team_id=team_id,
    recipient_user_id=user_id,
    thread_ts=thread_ts,
    task_display_mode="plan",
)
streamer.append(
    chunks=[
        MarkdownTextChunk(
            text="Hello.\nI have received the task. ",
        ),
        MarkdownTextChunk(
            text="This task appears manageable.\nThat is good.",
        ),
        TaskUpdateChunk(
            id="001",
            title="Understanding the task...",
            status="in_progress",
            details="- Identifying the goal\n- Identifying constraints",
        ),
        TaskUpdateChunk(
            id="002",
            title="Performing acrobatics...",
            status="pending",
        ),
    ],
)
time.sleep(4)
streamer.append(
    chunks=[
        PlanUpdateChunk(
            title="Adding the final pieces...",
        ),
        TaskUpdateChunk(
            id="001",
            title="Understanding the task...",
            status="complete",
            details="\n- Pretending this was obvious",
            output="We'll continue to ramble now",
        ),
        TaskUpdateChunk(
            id="002",
            title="Performing acrobatics...",
            status="in_progress",
        ),
    ],
)
time.sleep(4)
streamer.stop(
    chunks=[
        PlanUpdateChunk(
            title="Decided to put on a show",
        ),
        TaskUpdateChunk(
            id="002",
            title="Performing acrobatics...",
            status="complete",
            details="- Jumped atop ropes\n- Juggled bowling pins\n- Rode a single wheel too",
        ),
        MarkdownTextChunk(
            text="The crowd appears to be astounded and applauds :popcorn:",
        ),
    ],
)
```
Full Changelog: https://github.com/slackapi/python-slack-sdk/compare/v3.39.0...v3.40.0 Milestone: https://github.com/slackapi/python-slack-sdk/milestone/115
Full Changelog: https://github.com/slackapi/python-slack-sdk/compare/v3.39.0...v3.40.0.dev0
Full Changelog: https://github.com/slackapi/python-slack-sdk/compare/v3.38.0...v3.39.0 Milestone: https://github.com/slackapi/python-slack-sdk/milestone/114?closed=1
black by @WilliamBergamin in https://github.com/slackapi/python-slack-sdk/pull/1778
Full Changelog: https://github.com/slackapi/python-slack-sdk/compare/v3.37.0...v3.38.0 Milestone: https://github.com/slackapi/python-slack-sdk/milestone/113?closed=1
https://github.com/user-attachments/assets/bc16597b-1632-46bb-b7aa-fe22330daf84
Try the AI Agent Sample app to explore the AI-enabled features and existing Assistant helper:
```shell
# Create a new AI Agent app
$ slack create slack-ai-agent-app --template slack-samples/bolt-python-assistant-template
$ cd slack-ai-agent-app/

# Initialize a Python virtual environment
$ python3 -m venv .venv
$ source .venv/bin/activate
$ pip install -r requirements.txt

# Add your OPENAI_API_KEY
$ export OPENAI_API_KEY=sk-proj-ahM...

# Run the local dev server
$ slack run
```
After the app starts, send a message to the "slack-ai-agent-app" bot for a unique response.
Loading states allow you not only to set the status (e.g. "My app is typing...") but also to sprinkle in some personality by cycling through a collection of loading messages:
```python
@app.message()
def handle_message(message, client):
    client.assistant_threads_setStatus(
        channel_id=channel_id,
        thread_ts=thread_ts,
        status="thinking...",
        loading_messages=[
            "Teaching the hamsters to type faster…",
            "Untangling the internet cables…",
            "Consulting the office goldfish…",
            "Polishing up the response just for you…",
            "Convincing the AI to stop overthinking…",
        ],
    )
```
The `chat_stream()` helper utility streamlines calling the three text streaming methods:
```python
# Start a new message stream
streamer = client.chat_stream(
    channel=channel_id,
    recipient_team_id=team_id,
    recipient_user_id=user_id,
    thread_ts=thread_ts,
)

# Loop over the OpenAI response stream
# https://platform.openai.com/docs/api-reference/responses/create
for event in returned_message:
    if event.type == "response.output_text.delta":
        streamer.append(markdown_text=f"{event.delta}")
    else:
        continue

feedback_block = create_feedback_block()
streamer.stop(blocks=feedback_block)
```
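Appending one API call per token delta can get chatty. One option is to buffer the deltas and flush them in larger batches, which keeps the streaming feel with fewer calls. A minimal sketch — `flush` below stands in for `streamer.append(markdown_text=...)` and is not part of the SDK:

```python
def stream_in_chunks(deltas, flush, min_chars=20):
    """Accumulate small text deltas and flush them in larger batches."""
    buffer = ""
    for delta in deltas:
        buffer += delta
        if len(buffer) >= min_chars:
            flush(buffer)
            buffer = ""
    if buffer:  # emit whatever remains when the stream ends
        flush(buffer)

batches = []
stream_in_chunks(["Hel", "lo ", "wor", "ld, ", "streaming!"], batches.append, min_chars=8)
# batches == ["Hello wor", "ld, streaming!"]
```

Tuning `min_chars` trades latency of the visible text against the number of append calls made.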
An alternative to the text streaming helper is to call the individual methods.
`client.chat_startStream`

First, start a chat text stream to stream a response to any message:
```python
@app.message()
def handle_message(client, context, event, message):
    # Start a new message stream
    stream_response = client.chat_startStream(
        channel=channel_id,
        recipient_team_id=team_id,
        recipient_user_id=user_id,
        thread_ts=thread_ts,
    )
    stream_ts = stream_response["ts"]
```
`client.chat_appendStream`

After starting a chat text stream, you can then append text to it in chunks (often from your favourite LLM SDK) to convey a streaming effect:
```python
for event in returned_message:
    if event.type == "response.output_text.delta":
        client.chat_appendStream(
            channel=channel_id,
            ts=stream_ts,
            markdown_text=f"{event.delta}",
        )
    else:
        continue
```
`client.chat_stopStream`

Lastly, you can stop the chat text stream to finalize your message:
```python
client.chat_stopStream(
    channel=channel_id,
    ts=stream_ts,
    blocks=feedback_block,
)
```
Add feedback buttons to the bottom of a message, after stopping a text stream, to gather user feedback:
```python
from typing import List

# Import paths may vary slightly by SDK version
from slack_sdk.models.blocks import (
    Block,
    ContextActionsBlock,
    FeedbackButtonsElement,
    FeedbackButtonObject,
)

def create_feedback_block() -> List[Block]:
    blocks: List[Block] = [
        ContextActionsBlock(
            elements=[
                FeedbackButtonsElement(
                    action_id="feedback",
                    positive_button=FeedbackButtonObject(
                        text="Good Response",
                        accessibility_label="Submit positive feedback on this response",
                        value="good-feedback",
                    ),
                    negative_button=FeedbackButtonObject(
                        text="Bad Response",
                        accessibility_label="Submit negative feedback on this response",
                        value="bad-feedback",
                    ),
                )
            ]
        )
    ]
    return blocks
```
```python
@app.message()
def handle_message(client, context, event, message):
    # ... previous streaming code ...

    # Stop the stream and add feedback buttons
    feedback_block = create_feedback_block()
    client.chat_stopStream(
        channel=channel_id,
        ts=stream_ts,
        blocks=feedback_block,
    )
```
`chat_postMessage` supports `markdown_text`

```python
response = client.chat_postMessage(
    channel="C111",
    markdown_text=markdown_content,
)
```
Learn more in https://github.com/slackapi/python-slack-sdk/pull/1718
📚 https://docs.slack.dev/reference/block-kit/blocks/markdown-block/
```python
from slack_sdk.models.blocks import MarkdownBlock

...

@app.message("hello")
def message_hello(say):
    say(
        blocks=[
            MarkdownBlock(text="**let's go!**"),
        ],
        text="let's go!",
    )
```
Learn more in https://github.com/slackapi/python-slack-sdk/pull/1748
Add support for the workflows.featured.{add|list|remove|set} methods:
```python
app.client.workflows_featured_add(channel_id="C0123456789", trigger_ids=["Ft0123456789"])
app.client.workflows_featured_list(channel_ids="C0123456789")
app.client.workflows_featured_remove(channel_id="C0123456789", trigger_ids=["Ft0123456789"])
app.client.workflows_featured_set(channel_id="C0123456789", trigger_ids=["Ft0123456789"])
```
Learn more in https://github.com/slackapi/python-slack-sdk/pull/1712
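With paired add/remove/set methods like these, a common pattern is to diff the currently featured triggers against a desired list and issue only the calls that are needed. A plain-set sketch — the `workflows_featured_add`/`workflows_featured_remove` calls above would consume the results; nothing below is SDK API:

```python
def featured_diff(current, desired):
    """Return (to_add, to_remove) so the featured list matches `desired`."""
    current_set, desired_set = set(current), set(desired)
    to_add = sorted(desired_set - current_set)      # in desired, not yet featured
    to_remove = sorted(current_set - desired_set)   # featured, no longer desired
    return to_add, to_remove

to_add, to_remove = featured_diff(["Ft01", "Ft02"], ["Ft02", "Ft03"])
# to_add == ["Ft03"], to_remove == ["Ft01"]
```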
Milestone: https://github.com/slackapi/python-slack-sdk/milestone/112 Full Changelog: https://github.com/slackapi/python-slack-sdk/compare/v3.36.0...v3.37.0 Package: https://pypi.org/project/slack-sdk/3.37.0/
Milestone: https://github.com/slackapi/python-slack-sdk/milestone/111 Full Changelog: https://github.com/slackapi/python-slack-sdk/compare/v3.35.0...v3.36.0
- channels param to files.upload v2 method by @seratch in https://github.com/slackapi/python-slack-sdk/pull/1641
- channel_id instead of channel in files_upload_v2 documentation by @WilliamBergamin in https://github.com/slackapi/python-slack-sdk/pull/1670
- conversations.requestShared approve, deny & list by @WilliamBergamin in https://github.com/slackapi/python-slack-sdk/pull/1530
- conversations.externalInvitePermissions.set API by @WilliamBergamin in https://github.com/slackapi/python-slack-sdk/pull/1517
- team.externalTeams.disconnect by @WilliamBergamin in https://github.com/slackapi/python-slack-sdk/pull/1526
- bot_access_tokens from the debug logs of socket mode by @WilliamBergamin in https://github.com/slackapi/python-slack-sdk/pull/1519

All issues/pull requests: https://github.com/slackapi/python-slack-sdk/milestone/99?closed=1
Full Changelog: https://github.com/slackapi/python-slack-sdk/compare/v3.29.0...v3.30.0