Chatbot Lacks an Immediate Response

```python
import openai
from gradio import Interface, Textbox, LiveShare

open_ai = OpenAI(base_url="http://localhost:5151/v1", api_key="lm-studio")

def chatbot(input_message):
    completion = open_ai.chat.completions.create(
        model="mradermacher/Llama-3-8B-Instruct-1048k-GGUF",
        messages=[
            {"role": "system", "content": "You are an intelligent assistant. You always provide well-reasoned answers that are both correct and helpful."},
            {"role": "user", "content": input_message},
        ],
        temperature=0.7,
    )
    new_message = {"role": "assistant", "content": ""}

    for chunk in completion:
        if chunk.choices[0].delta.content:
            print(chunk.choices[0].delta.content, end="", flush=True)
            new_message["content"] += chunk.choices[0].delta.content

    return new_message["content"]

live_share = LiveShare(
    inputs=Textbox(label="Enter a message"),
    outputs=[Output(type="text", label="Assistant's response")],
    fn=chatbot,
).launch()
```
To set up a chatbot using OpenAI’s API with Gradio’s Interface and LiveShare, you’ll need to adjust a few details in your provided code. Here is a refined and complete version:

```python
import gradio as gr
from openai import OpenAI

# Point the client at the local server's base URL and API key
client = OpenAI(base_url="http://localhost:5151/v1", api_key="lm-studio")

def chatbot(input_message):
    # Create a chat completion with the specified model
    completion = client.chat.completions.create(
        model="mradermacher/Llama-3-8B-Instruct-1048k-GGUF",
        messages=[
            {"role": "system", "content": "You are an intelligent assistant. You always provide well-reasoned answers that are both correct and helpful."},
            {"role": "user", "content": input_message},
        ],
        temperature=0.7,
    )

    # Return the assistant's reply from the first choice
    return completion.choices[0].message.content

# Define the Gradio interface with a textbox for input and text output
interface = gr.Interface(
    fn=chatbot,
    inputs=gr.Textbox(lines=2, placeholder="Enter a message"),
    outputs=gr.Textbox(label="Assistant's response"),
)

# Launch the Gradio interface with a public share link
interface.launch(share=True)
```

**Key Adjustments:**
1. **Client initialization**: Point the client at the local server's base URL (`http://localhost:5151/v1`) and supply the `lm-studio` API key.
2. **Response handling**: Extract the assistant's reply from the completion's choices rather than iterating over stream chunks — the original code never passed `stream=True`, so the completion is returned as a single object, not an iterator.
3. **Gradio interface setup**: Gradio has no `LiveShare` class; define a standard `gr.Interface` and launch it with `share=True` to get a shareable public link.

This complete script should correctly set up a Gradio-based interface for your chatbot, using the specified local model and OpenAI’s API. Ensure that the server at `http://localhost:5151` is running and accessible.
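If you do want the incremental, token-by-token behavior the question's `for chunk in completion` loop was reaching for, the request must be made with `stream=True`. Here is a hedged sketch of that variant — `accumulate_deltas` and `chatbot_stream` are names of my own choosing, and the model name and endpoint are taken from the question; the API call itself is untested against your server:

```python
def accumulate_deltas(deltas):
    """Join streamed content deltas into the full reply, skipping
    empty or None chunks (the final chunk's delta is typically None)."""
    return "".join(d for d in deltas if d)

def chatbot_stream(client, input_message,
                   model="mradermacher/Llama-3-8B-Instruct-1048k-GGUF"):
    """Stream a reply from an OpenAI-compatible server.

    `client` is an openai.OpenAI instance, e.g.
    OpenAI(base_url="http://localhost:5151/v1", api_key="lm-studio").
    """
    completion = client.chat.completions.create(
        model=model,
        messages=[
            {"role": "system", "content": "You are an intelligent assistant."},
            {"role": "user", "content": input_message},
        ],
        temperature=0.7,
        stream=True,  # the server now yields chunks with .choices[0].delta.content
    )
    return accumulate_deltas(
        chunk.choices[0].delta.content for chunk in completion
    )
```

Separating the pure accumulation step from the network call keeps the chunk-handling logic easy to test without a running server.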
Can you add code so that the chat history is maintained?
Sure! To maintain the chat history, you’ll need to keep track of the messages exchanged between the user and the assistant. Here is the updated code to include chat history management:

```python
import gradio as gr
from openai import OpenAI

# Point the client at the local server's base URL and API key
client = OpenAI(base_url="http://localhost:5151/v1", api_key="lm-studio")

# Initialize the chat history with the system message
chat_history = [
    {"role": "system", "content": "You are an intelligent assistant. You always provide well-reasoned answers that are both correct and helpful."}
]

def chatbot(input_message):
    # Add the user message to the chat history
    chat_history.append({"role": "user", "content": input_message})

    # Create a chat completion with the full history as context
    completion = client.chat.completions.create(
        model="mradermacher/Llama-3-8B-Instruct-1048k-GGUF",
        messages=chat_history,
        temperature=0.7,
    )

    # Add the assistant's response to the chat history
    reply = completion.choices[0].message.content
    chat_history.append({"role": "assistant", "content": reply})

    # Format the chat history for display, excluding the system message
    return "\n".join(
        f"{msg['role']}: {msg['content']}"
        for msg in chat_history
        if msg["role"] != "system"
    )

# Define the Gradio interface with a textbox for input and text output
interface = gr.Interface(
    fn=chatbot,
    inputs=gr.Textbox(lines=2, placeholder="Enter a message"),
    outputs=gr.Textbox(label="Assistant's response", lines=10),
)

# Launch the Gradio interface with a public share link
interface.launch(share=True)
```

**Key Adjustments:**
1. **Chat History Initialization**: A `chat_history` list is initialized with the system message.
2. **Appending Messages to History**: User and assistant messages are appended to the `chat_history`.
3. **Formatting Chat History**: The chat history is formatted for display, excluding the system message.

This ensures that the conversation history is maintained and displayed in the output textbox.
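One caveat: the module-level `chat_history` above is shared by every visitor to the interface, so two concurrent users would see each other's conversation. A per-session alternative is Gradio's `gr.ChatInterface`, which passes each session's own `(message, history)` to the callback. Here is a hedged sketch assuming Gradio 4.x-style tuple history; the `build_messages` helper is a name of my own choosing, and the wiring in the comments is untested against your server:

```python
SYSTEM_PROMPT = ("You are an intelligent assistant. You always provide "
                 "well-reasoned answers that are both correct and helpful.")

def build_messages(history, user_message, system_prompt=SYSTEM_PROMPT):
    """Convert ChatInterface history ([(user, assistant), ...]) into an
    OpenAI-style messages list ending with the new user message."""
    messages = [{"role": "system", "content": system_prompt}]
    for user_turn, assistant_turn in history:
        messages.append({"role": "user", "content": user_turn})
        messages.append({"role": "assistant", "content": assistant_turn})
    messages.append({"role": "user", "content": user_message})
    return messages

# Wiring sketch (requires the local server to be running):
# import gradio as gr
# from openai import OpenAI
# client = OpenAI(base_url="http://localhost:5151/v1", api_key="lm-studio")
#
# def respond(message, history):
#     completion = client.chat.completions.create(
#         model="mradermacher/Llama-3-8B-Instruct-1048k-GGUF",
#         messages=build_messages(history, message),
#         temperature=0.7,
#     )
#     return completion.choices[0].message.content
#
# gr.ChatInterface(respond).launch(share=True)
```

Because Gradio manages the history per browser session, no module-level state is needed, and `ChatInterface` also renders the conversation as a proper chat log instead of a flat textbox.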