## Introduction

Fabric is an open-source framework for augmenting humans using AI. It provides a modular framework for solving specific problems using a crowdsourced set of AI prompts that can be used anywhere.

Well, that is what the official GitHub page says, at least. What does it actually mean in simple terms? How about I just show you?

---

## Showcase

### Extracting Wisdom

Recently there was a new episode of the [Waveform Podcast](https://www.youtube.com/@Waveform). They usually release new episodes every Friday, but this time it was a Tuesday. I didn't really have time to catch up on it, but I wanted to see if there was anything important that I would miss. So I just ran the command below.

```
yt https://www.youtube.com/watch?v=KDZytz_To3Y | fabric -s -p extract_wisdom
```

![](https://i.imgur.com/XH5QpZ5.png)

Turns out it was a game show day. I guess I really didn't miss anything important.

### Knowledge Hub

So you guys know that I keep sharing interesting articles and videos that I find in the `#knowledge-hub` [group](https://chat.whatsapp.com/FAqDafywQUZ7yUrYcajg8I). All of that curation is done by me personally, but formatting those messages takes some time. So I recently created a *custom prompt* and started asking fabric to take care of it for me.

```
echo "iOS 18 (and AI) will give Siri much more control over your apps. Link is https://www.theverge.com/2024/5/30/24168175/ios-18-ai-siri-apple-apps The author is Emma Roth. Apple is planning a big AI update for Siri, and it could give you the ability to control specific iPhone app features with your voice, according to a report from Bloomberg. The revamped Siri will reportedly arrive next year as an update to iOS 18. As noted by Bloomberg, the update will allow Siri to analyze the activity on your phone while turning on Siri-controlled features automatically.
You’ll only be able to use Siri to control features in apps made by Apple to start, but the company plans on supporting “hundreds” of commands within its apps, Bloomberg reports." | fabric -p knowledge_hub
```

![](https://i.imgur.com/Edy9nuq.png)

### Raw Queries

Of course, every now and then I need to throw a quick query at an AI, which usually happens while I am programming. When I am in the zone, I'd rather not open up the clunky ChatGPT UI, which, once again, is not made for keyboard shortcuts. Instead, I use the command below in fabric.

```
echo "I need to use an async function can you help me write some skeleton code for it?" | fabric -p raw_query
```

![](https://i.imgur.com/1GuHyVg.png)

---

What you saw above was but a small taste of the whole thing. I just used 3 patterns (`extract_wisdom`, `knowledge_hub`, and `raw_query`) in conjunction with the YouTube transcript tool (`yt`). There are more than umm... (hold on, let me check something)

```
echo "how do i count the number of folders in a folder in command line linux, just let me know the commands with find, without any explaination or description" | fabric -s -p raw_query
```

![](https://i.imgur.com/1tlvWPY.png)

`110`. There are more than 100 different patterns that you can use. These are basically system prompts that you give to an LLM to make it behave in a certain way and respond in a particular manner. `find_hidden_message`, `extract_book_ideas`, `create_5_sentence_summary`, and `analyze_presentation` are just a few of the interesting patterns that I love.

---

So these are just fancy prompts?

![](https://i.imgur.com/iLjIDXW.jpeg)

Fabric is a lot more customisable than that, as it gives you the ability to switch between different models whenever you want. And it gives you a command-line interface, which by default means that it is programmable. Here we can see all the models available that I can use with my `api-key`.
```
fabric --listmodels
```

Output:

```
GPT Models:
airoboros-l2-70b auto bagel-34b chronos-hermes-13b cinematika-7b cinematika-7b:free
claude-1 claude-1.2 claude-2 claude-2.0 claude-2.0:beta claude-2.1 claude-2.1:beta claude-2:beta
claude-3-haiku claude-3-haiku:beta claude-3-opus claude-3-opus:beta claude-3-sonnet claude-3-sonnet:beta
claude-instant-1 claude-instant-1.0 claude-instant-1.1 claude-instant-1:beta
codellama-34b-instruct codellama-70b-instruct command command-r command-r-plus dbrx-instruct
deepseek-chat deepseek-coder dolphin-mixtral-8x7b eagle-7b fimbulvetr-11b-v2 firellava-13b
gemini-flash-1.5 gemini-pro gemini-pro-1.5 gemini-pro-vision gemma-7b-it gemma-7b-it:free gemma-7b-it:nitro
goliath-120b gpt-3.5-turbo gpt-3.5-turbo-0125 gpt-3.5-turbo-0301 gpt-3.5-turbo-0613 gpt-3.5-turbo-1106
gpt-3.5-turbo-16k gpt-3.5-turbo-instruct gpt-4 gpt-4-0314 gpt-4-1106-preview gpt-4-32k gpt-4-32k-0314
gpt-4-turbo gpt-4-turbo-preview gpt-4-vision-preview gpt-4o gpt-4o-2024-05-13 hermes-2-pro-llama-3-8b
llama-2-13b-chat llama-2-70b-chat llama-2-70b-chat:nitro llama-3-70b llama-3-70b-instruct llama-3-70b-instruct:nitro
llama-3-8b llama-3-8b-instruct llama-3-8b-instruct:extended llama-3-8b-instruct:free llama-3-8b-instruct:nitro
llama-3-lumimaid-70b llama-3-lumimaid-8b llama-3-lumimaid-8b:extended
llama-3-sonar-large-32k-chat llama-3-sonar-large-32k-online llama-3-sonar-small-32k-chat llama-3-sonar-small-32k-online
llama-guard-2-8b llava-13b llava-yi-34b lzlv-70b-fp16-hf midnight-rose-70b
mistral-7b-instruct mistral-7b-instruct-v0.1 mistral-7b-instruct-v0.2 mistral-7b-instruct-v0.3
mistral-7b-instruct:free mistral-7b-instruct:nitro mistral-7b-openorca mistral-large mistral-medium mistral-small mistral-tiny
mixtral-8x22b mixtral-8x22b-instruct mixtral-8x7b mixtral-8x7b-instruct mixtral-8x7b-instruct:nitro
mythalion-13b mythomax-l2-13b mythomax-l2-13b:extended mythomax-l2-13b:nitro mythomist-7b mythomist-7b:free
neural-chat-7b noromaid-20b noromaid-mixtral-8x7b-instruct
nous-capybara-34b nous-capybara-7b nous-capybara-7b:free
nous-hermes-2-mistral-7b-dpo nous-hermes-2-mixtral-8x7b-dpo nous-hermes-2-mixtral-8x7b-sft nous-hermes-2-vision-7b
nous-hermes-llama2-13b nous-hermes-yi-34b olmo-7b-instruct openchat-7b openchat-7b:free
openhermes-2-mistral-7b openhermes-2.5-mistral-7b
palm-2-chat-bison palm-2-chat-bison-32k palm-2-codechat-bison palm-2-codechat-bison-32k
phi-3-medium-128k-instruct phi-3-medium-128k-instruct:free phi-3-mini-128k-instruct phi-3-mini-128k-instruct:free
phind-codellama-34b psyfighter-13b psyfighter-13b-2
qwen-110b-chat qwen-14b-chat qwen-32b-chat qwen-4b-chat qwen-72b-chat qwen-7b-chat
remm-slerp-l2-13b remm-slerp-l2-13b:extended rwkv-5-3b-ai-town rwkv-5-world-3b
snowflake-arctic-instruct soliloquy-l3 stripedhyena-hessian-7b stripedhyena-nous-7b
toppy-m-7b toppy-m-7b:free toppy-m-7b:nitro weaver wizardlm-2-7b wizardlm-2-8x22b
xwin-lm-70b yi-34b yi-34b-chat yi-6b zephyr-7b-beta zephyr-7b-beta:free
```

The output would not even fit in a screenshot, so I had to paste it all here.

---

## Next Steps

And the best part about all of this is that it's free. Now, not all the models are free, but the entire functionality is, and you can even connect your own local models if you want. I have done that as well, but on my server, not on my personal laptop (God bless my M1).

I hope you guys are now excited to try it out. So, how do you do it? We will see that in the next blog post: [[How to use Fabric|How to use Fabric?]]. Go ahead and check it out.

---

Thanks for reading. Drop a text in our WhatsApp Community if you have any doubts or questions. See ya!
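P.S. In case you are curious about that folder-counting one-liner I asked `raw_query` for: a standard `find` + `wc` combo does the job. This is my own sketch of it, not the model's exact answer (which may well differ):

```shell
# Count the immediate sub-directories of the current directory:
# list every depth-1 entry that is a directory, then count the lines.
find . -mindepth 1 -maxdepth 1 -type d | wc -l
```

Every flag matters here: `-maxdepth 1` stops `find` from descending into nested folders, and `-mindepth 1` keeps `.` itself out of the count.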