add gpt-prompt-engineer

This commit is contained in:
yzfly
2023-07-14 01:26:12 +08:00
parent f3dd3797f0
commit aba7aa0ae9
3 changed files with 32 additions and 0 deletions


@@ -256,6 +256,8 @@ https://github.com/whoiskatrin/chart-gpt
|[WebGPT](https://github.com/0hq/WebGPT)|![GitHub Repo stars](https://badgen.net/github/stars/0hq/WebGPT)|WebGPT is an application built on the browser's WebGPU capability that runs GPT models directly in the browser|A promising direction~|
|[PentestGPT](https://github.com/GreyDGL/PentestGPT)|![GitHub Repo stars](https://badgen.net/github/stars/GreyDGL/PentestGPT)|A GPT-powered penetration testing tool|-|
|[ChatGPT.nvim](https://github.com/jackMort/ChatGPT.nvim)|![GitHub Repo stars](https://badgen.net/github/stars/jackMort/ChatGPT.nvim)|ChatGPT Neovim Plugin: Effortless Natural Language Generation with OpenAI's ChatGPT API.|-|
|[assafelovic/gpt-researcher](https://github.com/assafelovic/gpt-researcher)|![GitHub Repo stars](https://badgen.net/github/stars/assafelovic/gpt-researcher)|GPT based autonomous agent that does online comprehensive research on any given topic|-|
|[SkalskiP/awesome-chatgpt-code-interpreter-experiments](https://github.com/SkalskiP/awesome-chatgpt-code-interpreter-experiments)|![GitHub Repo stars](https://badgen.net/github/stars/SkalskiP/awesome-chatgpt-code-interpreter-experiments)|Awesome things you can do with ChatGPT + Code Interpreter combo 🔥|-|
#### [OpenGPT](https://open-gpt.app/)


@@ -24,6 +24,7 @@ OpenAI's ChatGPT large language model (LLM) is not open source; this section collects
|[FreedomGPT](https://github.com/ohmplatform/FreedomGPT) |![GitHub Repo stars](https://badgen.net/github/stars/ohmplatform/FreedomGPT)|-|A free, unrestricted GPT that runs locally on Windows and Mac, based on the Alpaca-LoRA model.|
|[FinGPT](https://github.com/AI4Finance-Foundation/FinGPT)|![GitHub Repo stars](https://badgen.net/github/stars/AI4Finance-Foundation/FinGPT)|Data-Centric FinGPT. Open-source for open finance! Revolutionize 🔥 We'll soon release the trained model.|A large language model for the finance domain|
|[baichuan-7B](https://github.com/baichuan-inc/baichuan-7B) |![GitHub Repo stars](https://badgen.net/github/stars/baichuan-inc/baichuan-7B)|A large-scale 7B pretraining language model developed by Baichuan |baichuan-7B is an open-source, commercially usable large-scale pretrained language model developed by Baichuan Intelligent Technology. Built on the Transformer architecture, it is a 7-billion-parameter model trained on roughly 1.2 trillion tokens, supports both Chinese and English, and has a context window of 4096 tokens. It achieves the best results for its size on the standard authoritative Chinese and English benchmarks (C-EVAL/MMLU).|
|[baichuan-inc/Baichuan-13B](https://github.com/baichuan-inc/Baichuan-13B)|![GitHub Repo stars](https://badgen.net/github/stars/baichuan-inc/Baichuan-13B)|A 13B large language model developed by Baichuan Intelligent Technology|-|
|[open_llama](https://github.com/openlm-research/open_llama) |![GitHub Repo stars](https://badgen.net/github/stars/openlm-research/open_llama)|OpenLLaMA, a permissively licensed open source reproduction of Meta AI's LLaMA 7B trained on the RedPajama dataset. |OpenLLaMA is a permissively licensed open-source reproduction of Meta AI's LLaMA-7B model, trained on the RedPajama dataset.|
### 大模型训练和微调