The first session (September 2, 2024) had about 70 participants, and the second session (September 5, 2024) had 100. Unfortunately, we only learned on the day of the session that Google Meet has a maximum limit of 100 people, so some people could not join. The third session will be held on September 9, 2024.
Scientific Maker AI Project
Recently, there have been many AI-related discussions in the main Facebook group 'Scientific Maker'. The group has set up multiple Discord servers for public AI usage.
The main server, 'ScientificMakerCampus', offers a variety of AI models for project supporters to use and has gone through rapid, iterative updates over the past six months.
The secondary servers 'Health Consultation med4o', 'Financial and Business Consultation rich4o', and 'Legal Consultation law4o' use advanced large language models and databases to provide free consultations to everyone, with the aim of making society fairer and more reasonable.
The participation method for the Scientific Maker AI Project is still evolving and should follow the rules announced for each session.
Four Steps to Use Large Language Models Locally
Apply for Hugging Face Account ➜ Download Large Language Model ➜ Download Msty ➜ Basic Operations of Msty
By completing these four steps, you can use large language models on your computer and adjust the model responses to your preferences. The following sections will introduce each step:
(1) Apply for a Hugging Face Account

Hugging Face is a platform where the machine learning community collaborates on models, datasets, and applications. It offers various large language models for download, some free and some requiring application. You need to create a Hugging Face account before downloading.
1. Go to the Hugging Face official website and click the Sign Up button

2. Enter your desired email and password. The email will be your login account.

3. Enter your user ID, real name, and check the box to agree to the usage policy.

4. After completion, you will return to the Hugging Face homepage in a logged-in state.

5. Go to your email and click the link in the Hugging Face verification email to proceed with subsequent operations.

6. Return to the Hugging Face page to see the verification success message.

7. Enter 'SciMaker' in the search box to see the group's current model list.

(2) Download Large Language Models
SciMaker currently has three large language models, with more to come in the future. This article explains two of them.
TaiwanPro Download Steps

1. Log in to Hugging Face and find SciMaker/TaiwanPro-Llama-3.1-8B in the search box.

2. Click the apply button to request access to TaiwanPro. Note: Currently, there are only two ways to apply for download: using SciSpot points or making a small donation to the 'Scientific Maker AI Project'.

3. Click 'your settings' to check the application status.

4. On the application status page, 'PENDING' means the application has been submitted but not yet approved.

5. If you try to download the model before approval, it won't be successful. You can skip to (3) Download Msty while waiting for notification.

6. When approved, you'll receive an email notification. Click 'in your settings' in the email to go to the Hugging Face page and check the application status.

7. The SciMaker/TaiwanPro-Llama-3.1-8B in the application list will change to ACCEPTED.

8. Return to the TaiwanPro-Llama-3.1-8B page to see the approval message.

9. Switch to the 'Files and versions' page.

10. Click the download button to start downloading TaiwanPro-Llama-3.1-8B.
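
If you prefer the command line to the browser's download button, the sketch below uses the huggingface_hub Python package (install it with pip install huggingface-hub) to fetch the same file. It is only a sketch under a few assumptions: your access request has already been ACCEPTED, you have created a Hugging Face access token in your account settings, and the GGUF filename shown here is a placeholder that you replace with the actual filename listed on the 'Files and versions' page.

from huggingface_hub import login, hf_hub_download

# Assumption: paste your own Hugging Face access token here.
login(token="hf_xxxxxxxxxxxxxxxxxxxx")

# Assumption: the filename is a placeholder; use the actual GGUF filename
# from the model's 'Files and versions' page.
local_path = hf_hub_download(
    repo_id="SciMaker/TaiwanPro-Llama-3.1-8B",
    filename="TaiwanPro-Llama-3.1-8B-Q4_K_M.gguf",
    local_dir="models",
)
print(local_path)  # the local file you will later import into Msty

The call returns the local path of the downloaded file, which is the file you will point Msty's 'Import GGUF Model...' dialog to in section (4).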

Qwen2-0.5B_Q4_test Download Steps

1. Log in to Hugging Face and find SciMaker/Qwen2-0.5B_Q4_test in the search box.

2. Switch to the 'Files and versions' page.

3. Click the download button to start downloading Qwen2-0.5B_Q4_test.
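
There is no application step for this repository, so the same huggingface_hub sketch works here without a token; again, the GGUF filename below is a placeholder to replace with the actual one from the 'Files and versions' page.

from huggingface_hub import hf_hub_download

# Assumption: the filename is a placeholder; check 'Files and versions'
# for the actual GGUF filename in the repository.
local_path = hf_hub_download(
    repo_id="SciMaker/Qwen2-0.5B_Q4_test",
    filename="Qwen2-0.5B_Q4_test.gguf",
    local_dir="models",
)
print(local_path)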

(3) Download Msty

Msty is a tool that allows users to run various large language models on their local computer with a beautiful interface. Anyone can learn to use it quickly.
1. Go to the Msty official website and click the 'Download Msty' button.

2. Download and install the version that matches your computer's operating system.

3. After Msty is installed, you'll see the interface below. First, click the 'SETUP LOCAL AI' button to install Msty's default local large language model, Gemma2 (you can chat with several different large language models at the same time).

4. When the progress bar reaches 100%, it means Gemma2 download is complete.

(4) Basic Operations of Msty
To use different large language models in Msty, you simply import each one individually. The following explains how to import TaiwanPro-Llama-3.1-8B and Qwen2-0.5B_Q4_test into Msty, along with their respective settings.
Using TaiwanPro-Llama-3.1-8B in Msty

1. Click the 'Computer' button in the left menu.

2. Click the 'Import GGUF Model...' button.

3. Select the TaiwanPro-Llama-3.1-8B model you just downloaded.

4. Name this model, for example: TaiwanPro.

5. In the 'Prompt Template' section, select Llama 3 Instruct as the template. Note: When importing different large language models, you need to select the corresponding template (a sketch of what this chat format looks like follows these steps).

6. After selection, click the 'Create' button in the bottom right corner to start creating a new model chat interface.

7. After creation, you'll see a brief green box notification message, then you can go to the 'Installed Models' tab to view the list of currently imported models.

8. Seeing TaiwanPro-Llama-3.1-8B means successful import. Press the 'X' button in the top right to close the window.

9. Hover your mouse over 'Misc' and a small button will appear. Click the button with the 'message icon and plus sign' to open a new chat.

10. From the menu below, switch the language model. Select TaiwanPro (the name you just gave).

11. Enter the content you want to chat about in the input box.

12. You'll get TaiwanPro's response.
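
For context on step 5: Llama 3.1 models are trained on a specific chat layout, and the 'Llama 3 Instruct' template tells Msty to wrap your messages in that layout, which is why choosing the wrong template degrades the replies. The Python sketch below is an illustration only; Msty builds this prompt for you, and the message texts are made-up examples.

# Rough illustration of the Llama 3 Instruct chat format that the template
# produces; Msty applies it automatically once the template is selected.
system_message = "You are a helpful assistant."
user_message = "Please introduce yourself."

prompt = (
    "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n"
    f"{system_message}<|eot_id|>"
    "<|start_header_id|>user<|end_header_id|>\n\n"
    f"{user_message}<|eot_id|>"
    "<|start_header_id|>assistant<|end_header_id|>\n\n"
)
print(prompt)

The special strings in this layout (such as <|eot_id|>) are the same ones used as stop parameters in the FAQ at the end of this article.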

Using Qwen2-0.5B_Q4_test in Msty

1. Click the 'Computer' button in the left menu.

2. Click the 'Import GGUF Model...' button.

3. Select the Qwen2-0.5B_Q4_test model you just downloaded.

4. Name this model, for example: Qwen2.

5. In the 'Prompt Template' section, select ChatML as the template. Note: When importing different large language models, you need to select the corresponding template (a sketch of the ChatML format follows these steps).

6. After selection, click the 'Create' button in the bottom right corner to start creating a new model chat interface.

7. After creation, you'll see a brief green box notification message, then you can go to the 'Installed Models' tab to view the list of currently imported models.

8. Seeing Qwen2-0.5B_Q4_test means successful import. Press the 'X' button in the top right to close the window.

9. Hover your mouse over 'Misc' and a small button will appear. Click the button with the 'message icon and plus sign' to open a new chat.

10. From the menu below, switch the language model. Select Qwen2 (the name you just gave).

11. Enter the content you want to chat about in the input box.

12. You'll get Qwen2's response.
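
Similarly, Qwen2 models expect the ChatML chat layout, which is why step 5 selects the ChatML template. The sketch below is again only an illustration of what the template produces; Msty handles it for you, and the message texts are made-up examples.

# Rough illustration of the ChatML chat format that the ChatML template produces.
system_message = "You are a helpful assistant."
user_message = "Please introduce yourself."

prompt = (
    "<|im_start|>system\n"
    f"{system_message}<|im_end|>\n"
    "<|im_start|>user\n"
    f"{user_message}<|im_end|>\n"
    "<|im_start|>assistant\n"
)
print(prompt)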

Advantages of Using Large Language Models in Msty
(1) Simple and Easy to Use: The interface works like a chat window, so anyone can pick it up quickly, and more advanced parameter settings are available when you need them.
(2) Protects Personal Privacy: Input information stays only on your computer.
(3) Built-in Assistant Roles: When asking questions, you can choose specific roles, with over 230 options, making the responses more suitable for your needs.

(4) Can Compare Multiple Models: Use the split window feature to use different large language models simultaneously and compare their responses.

(5) Built-in RAG Function: Can make large language models respond based on different data sources. Can import personal documents, connect to Obsidian vaults, link to multiple YouTube videos, etc.

(6) Can Use Web Search: Can answer questions based on real-time web data.

Frequently Asked Questions
Q: Can TaiwanPro-Llama-3.1-8B be used commercially?
A: It can only be used for personal or educational purposes.
Q: What should I do if I selected the wrong template (Prompt Template)?
A: Just import the model again and select the correct template.
Q: What computer hardware specifications are needed to use TaiwanPro-Llama-3.1-8B?
A: It's recommended to have a GPU with 8GB or more VRAM, or a Mac computer with M1, M2, or M3 chips.
Q: What hardware specifications are needed to use Qwen2-0.5B_Q4_test?
A: This model is very small, so it runs on most computer hardware.
Q: What should I do if TaiwanPro-Llama-3.1-8B keeps repeating the same content and won't stop?
A: You can add stop parameters in the advanced settings (these must be set when you first import the model). Click Advanced, select stop in the Parameters section, and paste the following stop strings, which are the Llama 3.1 special tokens that mark the end of a turn.
PARAMETER stop "<|start_header_id|>"
PARAMETER stop "<|end_header_id|>"
PARAMETER stop "<|eot_id|>"
PARAMETER stop "<|end_of_text|>"

Advanced Questions
Q: Can Msty generate API Keys for other software to use?
A: Currently not possible.
Q: Does TaiwanPro have voice functionality?
A: No.