Aziz Alto (iamaziz)

@iamaziz
iamaziz / df2fhtml.py
Created August 24, 2024 15:11
Convert a pandas DataFrame to a FastHTML table. Similar to `df.to_html()`, but it returns FastHTML components rather than an HTML string
import pandas as pd  # needed for the type hint below
from fasthtml import common as fh

def df2fhtml(df: pd.DataFrame, with_headers: bool = True, **kwargs):
    # One <tr> of <th> labels for the header, then one <tr> of <td> cells per DataFrame row
    cols = df.columns
    header = fh.Tr(*[fh.Th(fh.Label(col)) for col in cols])
    rows = [fh.Tr(*[fh.Td(df[col][i]) for col in cols]) for i in range(len(df))]
    return fh.Table(header if with_headers else '', *rows, **kwargs)
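For context, a small usage sketch; the sample DataFrame and the `cls` keyword (FastHTML's class attribute) are illustrative, not part of the gist:
df = pd.DataFrame({"name": ["Ada", "Linus"], "score": [95, 88]})
table = df2fhtml(df, with_headers=True, cls="striped")  # extra kwargs flow into fh.Table
print(fh.to_xml(table))  # to_xml (re-exported by fasthtml.common) renders it to an HTML string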
@iamaziz
iamaziz / dspy_ollama_getting_started.ipynb
Last active July 30, 2024 15:01
Getting started with DSPy and Ollama
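The notebook preview did not render, so here is a minimal sketch of the kind of setup the title suggests, assuming a mid-2024 dspy-ai release where `dspy.OllamaLocal` is available (the model name and signature string are illustrative):
import dspy

# Assumption: dspy.OllamaLocal is the Ollama client in dspy-ai 2.4.x;
# newer releases use dspy.LM("ollama_chat/<model>") instead.
lm = dspy.OllamaLocal(model="llama3")
dspy.settings.configure(lm=lm)

qa = dspy.Predict("question -> answer")  # minimal signature: question in, answer out
print(qa(question="What is DSPy?").answer)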
import torch
from langchain_community.embeddings import (
    OpenAIEmbeddings,
    OllamaEmbeddings,
    HuggingFaceEmbeddings,
    HuggingFaceBgeEmbeddings,
)

def embedding_func(selected_embedding: str = "HuggingFaceEmbeddings"):
    """
@iamaziz
iamaziz / ollachat_ar.py
Last active August 12, 2024 08:51
Ollama UI for Arabic (right-to-left) based on: https://github.com/iamaziz/ollachat (example in the comments)
import os
import json
import datetime
import streamlit as st
from llama_index.llms import Ollama
from llama_index.llms import ChatMessage
# https://docs.llamaindex.ai/en/stable/examples/llm/ollama.html
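The preview stops at the imports. Continuing from them, a minimal sketch of how the right-to-left chat loop might look; the CSS injection, model name, and widget layout are assumptions rather than the gist's actual code:
# Right-to-left layout for Arabic text (assumption: done via injected CSS)
st.markdown("<style>.stChatMessage, .stMarkdown {direction: rtl; text-align: right;}</style>",
            unsafe_allow_html=True)

llm = Ollama(model="llama2", request_timeout=120.0)

if prompt := st.chat_input("اكتب رسالتك هنا"):
    st.chat_message("user").write(prompt)
    response = llm.chat([ChatMessage(role="user", content=prompt)])
    st.chat_message("assistant").write(response.message.content)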
@iamaziz
iamaziz / code_completion_ide.py
Last active February 25, 2024 02:20
Simple offline code completion example with Ollama/Streamlit and code execution
import sys
from io import StringIO
import streamlit as st # pip install streamlit
from code_editor import code_editor # pip install streamlit_code_editor
import ollama as ol # pip install ollama
st.set_page_config(layout='wide')
st.title('`Offline code completion`')
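The preview ends before the completion and execution logic. Continuing from the imports above, a rough sketch of that loop; the editor-response keys, model name, and prompt wording are assumptions:
code_cell = code_editor("# write some Python, then trigger the editor's run shortcut", lang="python")

if code_cell.get("text"):
    # Ask a local model to continue the snippet
    resp = ol.chat(model="codellama",
                   messages=[{"role": "user",
                              "content": f"Complete this Python code:\n{code_cell['text']}"}])
    st.code(resp["message"]["content"], language="python")

    # Execute the user's code and show captured stdout
    captured = StringIO()
    sys.stdout = captured
    try:
        exec(code_cell["text"], {})
    finally:
        sys.stdout = sys.__stdout__
    st.text(captured.getvalue())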
@iamaziz
iamaziz / arabic_llm.ipynb
Last active February 22, 2024 07:47
Out-of-the-box open LLM generating Arabic text
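The notebook preview did not render. As a hedged illustration of the title, prompting an off-the-shelf local model for Arabic output through the Ollama Python client could look like this (the model name and prompt are assumptions):
import ollama

response = ollama.chat(
    model="mistral",
    messages=[{"role": "user", "content": "اكتب فقرة قصيرة عن البرمجة باللغة العربية"}],
)
print(response["message"]["content"])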
@iamaziz
iamaziz / custom_arabic_llm.ipynb
Created February 22, 2024 07:08
Custom open `local` LLMs (Ollama) for the Arabic language
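The notebook preview did not render. With Ollama, a "custom" local model usually means a Modelfile; a minimal sketch with an Arabic system prompt (base model, model name, and prompt are assumptions):
import subprocess
from pathlib import Path

# Hypothetical Modelfile: base model and Arabic system prompt are illustrative
modelfile = """\
FROM llama2
PARAMETER temperature 0.7
SYSTEM "أنت مساعد ذكي يجيب دائماً باللغة العربية الفصحى."
"""

Path("Modelfile").write_text(modelfile, encoding="utf-8")
subprocess.run(["ollama", "create", "arabic-assistant", "-f", "Modelfile"], check=True)
# then chat with it: ollama run arabic-assistant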
@iamaziz
iamaziz / fast_speech_text_speech_steramlit_app.py
Last active May 28, 2024 08:08 — forked from thomwolf/fast_speech_text_speech.py
speech to text to speech (Streamlit app) - example in the comments
""" To use: install Ollama (or LLM studio), clone OpenVoice, run this script in the OpenVoice directory
git clone https://github.com/myshell-ai/OpenVoice
cd OpenVoice
git clone https://huggingface.co/myshell-ai/OpenVoice
cp -r OpenVoice/* .
pip install whisper pynput pyaudio streamlit ollama
script source: https://x.com/Thom_Wolf/status/1758140066285658351?s=20
"""

Using local LLMs anywhere (in a text editor) - example below with Obsidian

Inspired by and adapted from LLM-automator.

Code example with mixtral

(demo video: automator-llm-code.mp4)
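A minimal sketch of the idea (not LLM-automator's own code): a script that reads the selected text from stdin, sends it to a local mixtral through the Ollama client, and prints the completion, which an Automator quick action or Obsidian hotkey could wire in; the prompt wording is an assumption:
import sys
import ollama

selected = sys.stdin.read()
resp = ollama.chat(
    model="mixtral",
    messages=[{"role": "user",
               "content": f"Complete or improve the following text:\n\n{selected}"}],
)
sys.stdout.write(resp["message"]["content"])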