diff --git a/.github/workflows/main.yml b/.github/workflows/main.yml
new file mode 100644
index 0000000..a189c72
--- /dev/null
+++ b/.github/workflows/main.yml
@@ -0,0 +1,15 @@
+on:
+  push:
+    branches:
+      - main
+
+jobs:
+  contrib-readme-job:
+    runs-on: ubuntu-latest
+    permissions: write-all
+    name: A job to automate contrib in readme
+    steps:
+      - name: Contribute List
+        uses: akhilmhdh/contributors-readme-action@v2.3.4
+        env:
+          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
diff --git a/LICENSE b/LICENSE
new file mode 100644
index 0000000..5533a34
--- /dev/null
+++ b/LICENSE
@@ -0,0 +1,21 @@
+MIT License
+
+Copyright (c) 2023 Botiverse
+
+Permission is hereby granted, free of charge, to any person obtaining a copy
+of this software and associated documentation files (the "Software"), to deal
+in the Software without restriction, including without limitation the rights
+to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+copies of the Software, and to permit persons to whom the Software is
+furnished to do so, subject to the following conditions:
+
+The above copyright notice and this permission notice shall be included in all
+copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+SOFTWARE.
diff --git a/README.md b/README.md
index d4653b4..f56dcb0 100644
--- a/README.md
+++ b/README.md
@@ -1,7 +1,7 @@
-

+

Botiverse is a Python package that bridges the gap between developers, regardless of their machine learning expertise,
and building chatbots. It offers a diverse set of modern chatbot architectures that are ready to be trained in a high-level fashion while
offering optional fine-grained control for advanced use-cases.
@@ -10,7 +10,7 @@ offering optional fine-grained control for advanced use-cases.
We strongly recommend referring to the [documentation](https://botiverse.readthedocs.io/en/latest/) which also comes with a humble [user guide](https://botiverse.readthedocs.io/en/latest/user_guide.html).

## 🚀 Installation
-For standard use, consider
+For standard use, consider Python 3.9+ and
```shell
pip install botiverse
```
@@ -24,6 +24,9 @@ pip install botiverse[voice]
```
and make sure to also have FFMPEG on your machine, as needed by the unavoidable dependency `PyAudio`.

+## 🐞 Bugs, Suggestions or Questions
+Please feel welcome to submit an issue.
+
## 🏮 Basic Demo
Import the chatbot you need from `botiverse.bots`. All bots have a similar interface consisting of a read, train and infer method.
@@ -41,7 +44,7 @@ bot.infer("Hello there!")
```

## 💥 Supported Chatbots
-Botiverse offers 7 main chatbot architectures that cover a wide variety of use cases:
+Botiverse offers 7 main chatbot architectures that cover a wide variety of use cases:

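Every architecture listed here follows the same read, train and infer flow shown in the Basic Demo above. Below is a minimal sketch of that flow with `BasicBot`; the no-argument constructor is an assumption made for brevity, and the actual construction options (machine, representer, etc.) are listed in the docs.

```python
from botiverse.bots import BasicBot

# Assumption: BasicBot can be constructed with defaults; check the docs for
# the machine/representer options it actually accepts.
bot = BasicBot()

bot.read_data('dataset.json')        # read a dataset in the documented JSON format
bot.train()                          # train with default hyperparameters
print(bot.infer("Hello there!"))     # infer a response to a user utterance
```

The same three calls (with bot-specific arguments) appear in each Quick Example below.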
@@ -99,7 +102,7 @@ Botiverse offers 7 main chatbot architectures that cover a wide variety of use c
- Beyond being a chatbot package, most representers and models can also be used independently and share the same API. For instance, you can import your favorite model or representer from `botiverse.models` or `botiverse.preprocessors` respectively and use it for any ordinary machine learning task.
- It follows that some chatbot architectures also allow using a custom representer or model as long as it satisfies the relevant unified interface (as in the docs); see the sketch after this section.

-Now let's learn more about each chatbot available inn `botiverse.bots`
+Now let's learn more about each chatbot available in `botiverse.bots`

## Basic Bot
### 🏃‍♂️ Quick Example
```python
@@ -113,12 +116,12 @@ bot.infer("Hello there!")
```
Please check this for the [documentation](https://botiverse.readthedocs.io/en/latest/botiverse.bots.BasicBot.html) which also includes the [user guide](https://botiverse.readthedocs.io/en/latest/BasicBot.html).

### 🏮 Demo
-The following is the result (in its best form) from training the `Basic Bot` on a small synthetic `dataset.json` as found in the examples to
- answer FAQs for a university website
+The following is the result (in its best form) from training the `Basic Bot` on a small synthetic `dataset.json` as found in the examples to answer FAQs for a university website

![Basic](https://github.com/TheBotiverse/Botiverse/assets/49572294/976edf97-66be-468b-9c1b-3533edd7c3d1)

-You can simulate a similar demo offline using the notebook in the `Examples` folder or online on [Google collab](google.com).
+You can simulate a similar demo offline using the notebook in the [Examples](https://github.com/TheBotiverse/Botiverse/tree/main/Examples) folder or online on [Google Colab](https://colab.research.google.com/drive/1MW4PmQ8BOBkfXO-X-IpzHr6-n8CsXKt7#scrollTo=fifdbsJduJF-).
+
> Google Colab won't have a server to run the chat GUI; the options are to use a humble version by setting `server=False` or to provide an [ngrok](https://ngrok.com/) authentication token in the `auth_token` argument.
> You will have to manually drop the dataset from the examples folder into the data section in Colab.
@@ -127,7 +130,6 @@
### 🏃‍♂️ Quick Example
```python
bot = WhizBot(repr='BERT')
-
bot.read_data('./dataset_ar.json')
bot.train(epochs=10, batch_size=32)
bot.infer("ما هي الدورات المتاحة؟")  # "What are the available courses?"
@@ -141,16 +143,14 @@ The following is the result (in its best form) from training the `Whiz Bot` on a

![Whiz](https://github.com/TheBotiverse/Botiverse/assets/49572294/d85c2825-5061-4d5e-b8b3-67823e93f789)

-You can simulate a similar demo offline using the notebook in the `Examples` folder or online on [Google collab](google.com).
+You can simulate a similar demo offline using the notebook in the [Examples](https://github.com/TheBotiverse/Botiverse/tree/main/Examples) folder or online on [Google Colab](https://drive.google.com/file/d/14sq63p-HLAmWkZXX42aJSI3uCSMfoAkY/view?usp=sharing).

> Note that the performance of both the basic bot and whiz bot largely scales with the quality and size of the dataset; the one we use here is a small synthetic version generated by LLMs and could be greatly improved if given time.
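Relating to the custom representers point in the Features list above: a chatbot that accepts a custom representer or model only requires the object to satisfy the unified interface described in the docs. The sketch below is purely illustrative; `LengthRepresenter`, its `transform` method, and the commented usage line are hypothetical stand-ins, not Botiverse's actual required interface.

```python
import numpy as np

# A toy, duck-typed "representer". The real requirement is whatever unified
# interface the Botiverse docs specify; transform() here is only a stand-in.
class LengthRepresenter:
    def transform(self, sentences):
        # toy featurization: characters and words per sentence
        return np.array([[len(s), len(s.split())] for s in sentences], dtype=float)

print(LengthRepresenter().transform(["Hello there!", "Book me a flight"]))
# Hypothetical usage, if a bot accepts a custom representer:
# bot = BasicBot(repr=LengthRepresenter())
```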
## Basic Task Bot
### 🏃‍♂️ Quick Example
```python
-chatbot = BasicTaskBot(domains_slots, templates, domains_pattern, slots_pattern)
-bot.read_data('dataset.json')
-bot.train()
+bot = BasicTaskBot(domains_slots, templates, domains_pattern, slots_pattern)
bot.infer("I want to book a flight")
```
### 🗂️ User Guide & Docs
@@ -162,7 +162,7 @@ The following is the result from building a simple `Basic Task Bot` to perform s

![BasicTask](https://github.com/TheBotiverse/Botiverse/assets/49572294/ea319c36-5574-4434-99cc-67f18ad9593f)

-You can simulate a similar demo offline using the notebook in the `Examples` folder or online on [Google collab](google.com).
+You can simulate a similar demo offline using the notebook in the [Examples](https://github.com/TheBotiverse/Botiverse/tree/main/Examples) folder or online on [Google Colab](https://drive.google.com/file/d/1V0__NnSFjg4DmN_fp_-gI7WmMz2G7_cb/view?usp=sharing).

## Task Bot
@@ -182,8 +182,7 @@ The following is the result from training the `Task Bot` on the [sim-R](https://

![TaskBot](https://github.com/TheBotiverse/Botiverse/assets/49572294/1bc188a6-c15d-4fad-b72c-421340b7e00c)

-You can simulate a similar demo offline using the notebook in the `Examples` folder or online on [Google collab](google.com).
-
+You can simulate a similar demo offline using the notebook in the [Examples](https://github.com/TheBotiverse/Botiverse/tree/main/Examples) folder or online on [Google Colab](https://drive.google.com/file/d/1IgpKFZGX5UKABLfB0fzUJ624ZJoLFEX6/view?usp=sharing).

## Converse Bot
### 🏃‍♂️ Quick Example
@@ -202,7 +201,7 @@ The following is the result from the `Converse Bot` before training on Amazon cu

![Converse](https://github.com/TheBotiverse/Botiverse/assets/49572294/f343a884-86c4-42ab-9339-1c5a34393bd5)

-You can simulate a similar demo offline using the notebook in the `Examples` folder or online on [Google collab](google.com).
+You can simulate a similar demo offline using the notebook in the [Examples](https://github.com/TheBotiverse/Botiverse/tree/main/Examples) folder or online on [Google Colab](https://drive.google.com/file/d/1YCznzhRzv_TDmj1F595THe-6KwQEWf2Y/view?usp=sharing).

## Voice Bot
@@ -215,12 +214,12 @@ bot.simulate_call()
Please check this for the [documentation](https://botiverse.readthedocs.io/en/latest/botiverse.bots.VoiceBot.html) which also includes the [user guide](https://botiverse.readthedocs.io/en/latest/VoiceBot.html).
An independent submodule of the `voice bot` is a speech classifier which may learn from zero-shot data (synthetic generation). If interested in that, then check this for the [documentation](https://botiverse.readthedocs.io/en/latest/botiverse.bots.SpeechClassifier.html) which also includes the [user guide](https://botiverse.readthedocs.io/en/latest/SpeechClassifier.html).

### 🏮 Demo
-The following is the result from building a `Voice Bot` on a hand-crafted call state machine as found in the examples. The voice bot requires no training data.
+The following is the result from building a `Voice Bot` on a hand-crafted call state machine as found in the [Examples](https://github.com/TheBotiverse/Botiverse/tree/main/Examples). The voice bot requires no training data.

https://github.com/TheBotiverse/Botiverse/assets/49572294/cd58965e-3659-4495-baa1-d87da1c01215

-You can only simulate a similar demo offline using the notebook in the `Examples` folder. This applies to both the voice bot and the speech classifier.
+You can only simulate a similar demo offline using the notebook in the [Examples](https://github.com/TheBotiverse/Botiverse/tree/main/Examples) folder. This applies to both the voice bot and the speech classifier.

## Theorizer
### 🏃‍♂️ Quick Example
```python
@@ -230,12 +229,75 @@ QAs = generate(context)
print(json.dumps(QAs,indent=4))
```
### 🗂️ User Guide & Docs
-Please check this for the [documentation](https://botiverse.readthedocs.io/en/latest/botiverse.bots.Theorizer.html) which also includes the [user guide](https://botiverse.readthedocs.io/en/latest/Theorizer.html).
+Please check this for the [documentation](https://botiverse.readthedocs.io/en/latest/botiverse.Theorizer.html) which also includes the [user guide](https://botiverse.readthedocs.io/en/latest/Theorizer.html).

### 🏮 Demo
-No demo is available yet for the Theorizer; you may check the example in the `Examples` folder.
-
+No demo is available yet for the Theorizer; you may check the example in the [Examples](https://github.com/TheBotiverse/Botiverse/tree/main/Examples) folder.

## 🌆 Models and Preprocessors
Most could be independently used in any task; please consult the relevant section of the [documentation](https://botiverse.readthedocs.io/en/latest/) and the `Examples` folder.
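Since most models and representers can also be used on their own (as noted here and in the Features list), usage along the following lines should be possible. Note that `TfIdf`, `NeuralNet` and their `transform`/`fit`/`predict` methods are placeholder names for illustration only; consult `botiverse.models` and `botiverse.preprocessors` in the documentation for the classes and methods that actually ship.

```python
# Illustrative only: TfIdf and NeuralNet are placeholder names, not
# necessarily the classes Botiverse exports. The point being sketched is that
# representers and models are importable and usable outside any chatbot.
from botiverse.preprocessors import TfIdf   # hypothetical representer
from botiverse.models import NeuralNet      # hypothetical model

texts = ["refund my order", "track my package"]
labels = [0, 1]

representer = TfIdf()                        # turn raw text into vectors
X = representer.transform(texts)             # assumed transform-style API

model = NeuralNet()                          # any ordinary classifier
model.fit(X, labels)                         # assumed fit/predict-style API
print(model.predict(representer.transform(["where is my package"])))
```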
+
+## 👥 Collaborators
+
+| Collaborator | GitHub | Contribution |
+| --- | --- | --- |
+| Essam | [EssamWisam](https://github.com/EssamWisam) | Basic Bot and Voice Bot & Relevant Models |
+| Yousef Atef | [YousefAtefB](https://github.com/YousefAtefB) | Basic and Deep Task Bot & Relevant Models |
+| Muhammad Saad | [Muhammad-saad-2000](https://github.com/Muhammad-saad-2000) | Whiz and Converse Bot & Relevant Models |
+| Kariiem Taha | [Kariiem](https://github.com/Kariiem) | Theorizer & Relevant Models |
+
+Sincere thanks to [Abdelrahman Jamal](https://github.com/Hero2323) for helping test the package on Windows.