<!--
    chatgpt.xml, changeset 2:dab494dce303 (draft)
    planemo upload for repository https://github.com/bgruening/galaxytools/tree/master/tools/chatgpt commit d2d08c3866c0f4a2f10372ae15c5dac5ea2d0bf0
    author:  bgruening
    date:    Fri, 23 Aug 2024 10:21:21 +0000
    parent:  08c658e9aa9e
    child:   7770a4bd42e2
-->
<tool id="chatgpt_openai_api" name="chatGPT" version="@TOOL_VERSION@+galaxy@VERSION_SUFFIX@" profile="23.0"> <description>Integrating OpenAI's ChatGPT into Galaxy</description> <macros> <token name="@TOOL_VERSION@">2024</token> <token name="@VERSION_SUFFIX@">1</token> </macros> <requirements> <requirement type="package" version="3.12">python</requirement> <requirement type="package" version="1.35.13">openai</requirement> </requirements> <command detect_errors="exit_code"><![CDATA[ #import os #import re #set $LINK_LIST = [] #for $input in $context: #set file_name, ext = os.path.splitext($input.element_identifier) #if ext == '' #set ext = '.'+$input.ext #end if #set LINK = re.sub('[^\w\-]', '_', str($file_name))+$ext ln -s '$input' '$LINK' && $LINK_LIST.append($LINK) #end for python '$__tool_directory__/chatgpt.py' '$str(",".join($LINK_LIST))' '$question' '$model' '$openai_api_key_file' ]]></command> <configfiles> <configfile name="openai_api_key_file"><![CDATA[ $__user__.extra_preferences.get('chatgpt|api_key', "") ]]></configfile> </configfiles> <inputs> <conditional name="input_type"> <param name="input_type_selector" type="select" label="Choose the model" help="Vision models are capable to have image as input."> <option value="vision" selected="true">Vision models</option> <option value="all">All models</option> </param> <when value="vision"> <param name="model" type="select" optional="false" label="Model" help="Select the model you want to use"> <option value="gpt-4o-mini" selected="true">Affordable and intelligent small model for fast, lightweight tasks (gpt-4o-mini)</option> <option value="gpt-4o">High-intelligence flagship model for complex, multi-step tasks (gpt-4o)</option> <option value="gpt-4-turbo">The previous set of high-intelligence model with vision capabilities (gpt-4-turbo)</option> </param> <param name="context" type="data" multiple="true" optional="false" format="doc,docx,html,json,pdf,txt,jpg,jpeg,png,webp,gif" label="Context" max="500" help="This data will be uploaded to OpenAI's servers for processing."/> </when> <when value="all"> <param name="model" type="select" optional="false" label="Model" help="Select the model you want to use"> <option value="gpt-4o-mini" selected="true">Affordable and intelligent small model for fast, lightweight tasks (gpt-4o-mini)</option> <option value="gpt-4o">High-intelligence flagship model for complex, multi-step tasks (gpt-4o)</option> <option value="gpt-4-turbo">The previous set of high-intelligence model with vision capabilities (gpt-4-turbo)</option> <option value="gpt-4" selected="true">The previous set of high-intelligence model (gpt-4)</option> <option value="gpt-3.5-turbo">A fast, inexpensive model for simple tasks (GPT-3.5-turbo)</option> </param> <param name="context" type="data" multiple="true" optional="false" format="doc,docx,html,json,pdf,txt" label="Context" max="500" help="This data will be uploaded to OpenAI's servers for processing."/> </when> </conditional> <param name="question" type="text" optional="false" label="Question" help="Question about the text provided." 
area="true"> <validator type="empty_field"/> </param> </inputs> <outputs> <data name="output" format="txt" label="${tool.name} on ${on_string}" from_work_dir="./output.txt"/> </outputs> <tests> <test expect_failure="true" expect_exit_code="1"> <param name="context" value="chatgpt_test.txt" ftype="txt"/> <param name="question" value="What is this?"/> <param name="model" value="gpt-4o-mini"/> <assert_stdout> <has_text text="OpenAI API key is not provided in user preferences!"/> </assert_stdout> </test> </tests> <help><![CDATA[ .. class:: infomark **What it does** This tool leverages OpenAI's ChatGPT API to generate responses based on user-provided context and questions. Users can upload context data in various formats and ask questions related to that data. The tool then uploads the data to a OpenAI server and processes them using the selected ChatGPT model, returning an AI-generated response tailored to the context provided. To utilize this tool, users need to input their OpenAI API key in the user preferences. To obtain an API key, visit API keys page in your OpenAI Dashboard. When you run this tool, your input data is sent to OpenAI's servers using your API-key. OpenAI's models process the data and generate a response based on the context and question provided. After receiving the response from the OpenAI server, the tool returns it to Galaxy and puts it in your history. The files that have been uploaded are then deleted from the OpenAI's server, so they are not stored beyond their necessary use. If the tool fails to delete your uploaded files automatically, you can manually delete them in your openai account page. You might want to check your OpenAI storage from time to time as they also have a quota. For more information on the tool refer to GitHub README_ file. .. _README: https://github.com/bgruening/galaxytools/blob/master/tools/chatgpt/README.md Usage ..... **Input** 1. **Upload Context Data**: Users can upload up to 500 files in formats such as DOC, DOCX, HTML, JSON, PDF, TXT, JPG, JPEG, PNG, WEBP, or GIF. This context data serves as the background information for the question you wish to ask. 2. **Ask a Question**: Once the context data is added, users can pose a question related to the content. The more specific the question, the more tailored the response will be. 3. **Select a Model**: Choose the ChatGPT model that best fits your needs. Information about different models and their pricing can be found on the OpenAI website. **Output** The output is a response generated by ChatGPT, crafted based on the provided context data and the question posed. This response is saved in the `output.txt` file. ]]></help> <citations> <citation type="bibtex"> @misc{openai, author = {OpenAI}, title = {OpenAI's ChatGPT}, howpublished = {\url{https://openai.com/chatgpt}}, year = {2024}, note = {Accessed: 2024-07-26} } </citation> </citations> </tool>