Artificial Intelligence

How to Run the LLaMA Web UI on Colab or Locally?

After ChatGPT, Meta released a game-changing large language model: Llama 2. The great thing about Llama 2 is that it is open source. Nick Clegg, the President of Global Affairs at Meta, told BBC Radio 4’s Today programme on Wednesday that open-sourcing large language models (LLMs) could enhance their safety and quality by exposing them to external review. He emphasized the value of collective wisdom in improving these systems and, notably, in distancing them from the sole control of big tech firms, which are currently the only ones equipped with the computational power or extensive data pools necessary to build these models.

Furthermore, Meta’s move to open-source could potentially disrupt the competitive landscape by enabling any entity to develop a rival to chatbots like ChatGPT, Bard, or Microsoft’s Bing. This could be seen as a way to dilute the competitive advantage of tech giants like Google.

In this article, I am going to show you how to run the latest model on a free Google Colab account and interact with it through a Gradio web UI deployment. I am using Camanduru’s GitHub repository to get the code and run it on Google Colab.

All credit goes to Camanduru.

  1. First, go to this repository: repo
  2. Click on the llama-2-7b-chat.ipynb file there.
  3. You will be redirected to the notebook. Copy the whole code, paste it into your Google Colab, and run it.

Note: Switch your hardware accelerator to GPU and the GPU type to T4 before running it.
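If you want to confirm the runtime actually has a GPU before running the notebook, you can check from a Colab cell. Here is a minimal sketch (the `gpu_info` helper is illustrative, not part of the notebook):

```python
import shutil
import subprocess

def gpu_info():
    """Return the `nvidia-smi` summary if a GPU is visible, else None."""
    if shutil.which("nvidia-smi") is None:
        return None
    return subprocess.run(["nvidia-smi"], capture_output=True, text=True).stdout

info = gpu_info()
print(info if info else "No GPU detected - switch the Colab runtime to a T4 GPU.")
```

On a correctly configured T4 runtime, the summary will list a "Tesla T4" device.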

  4. After running the code, you will get a Gradio live link to the Llama 2 web UI chat interface.

  5. Click on the link and enter your prompts.
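The web UI formats your prompts for you, but it helps to know that Llama-2-chat models expect input wrapped in a specific instruction template, which matters if you later call the model directly instead of through the UI. A minimal sketch of the standard Llama 2 chat format (the function name is illustrative):

```python
def build_llama2_prompt(user_message, system_prompt="You are a helpful assistant."):
    """Wrap a user message in the Llama-2-chat instruction template."""
    return (
        f"<s>[INST] <<SYS>>\n{system_prompt}\n<</SYS>>\n\n"
        f"{user_message} [/INST]"
    )

prompt = build_llama2_prompt("What is open-source software?")
print(prompt)
```

The model's reply is everything generated after the closing `[/INST]` tag.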

Conclusion

Meta, under the leadership of Mark Zuckerberg, has introduced an open-source artificial intelligence model named Llama 2. This large language model (LLM) can be used to build chatbots similar to ChatGPT and is accessible to startups, established businesses, and individual developers. As Nick Clegg, Meta’s President of Global Affairs, noted on BBC Radio 4’s Today programme, open-sourcing LLMs can potentially enhance their safety and quality by allowing for external evaluation. This move not only democratizes access to advanced AI technology but also invites a diverse range of perspectives to improve its development and application, potentially disrupting the current tech landscape.



How to Run the LLAMA Web UI on Collab or Locally? was originally published in Artificial Intelligence in Plain English on Medium, where people are continuing the conversation by highlighting and responding to this story.

By: M Vaseem
Title: How to Run the LLAMA Web UI on Collab or Locally?
Sourced From: ai.plainenglish.io/how-to-run-the-llama-web-ui-on-collab-or-locally-ecbbc34e3c88?source=rss—-78d064101951—4
Published Date: Mon, 24 Jul 2023 10:16:58 GMT
