update doc; perf: model test (#4098)

* perf: extract array
* update doc
* perf: model test
* perf: model test
parent 9bb8525909
commit 75e671bb43
@@ -69,7 +69,7 @@ Project tech stack: NextJs + TS + ChakraUI + MongoDB + PostgreSQL (PG Vector plu

> When using [Sealos](https://sealos.io) services, there is no need to purchase servers or domain names. It supports high concurrency and dynamic scaling, and the database application uses the kubeblocks database, which far exceeds a simple Docker container deployment in terms of IO performance.

<div align="center">

[](https://cloud.sealos.io/?openapp=system-fastdeploy%3FtemplateName%3Dfastgpt)
[](https://cloud.sealos.io/?openapp=system-fastdeploy%3FtemplateName%3Dfastgpt&uid=fnWRt09fZP)

</div>

Give it a 2-4 minute wait after deployment while it sets up the database. It might be a little slow at first since we're using the basic settings.
@@ -94,7 +94,7 @@ https://github.com/labring/FastGPT/assets/15308462/7d3a38df-eb0e-4388-9250-2409b

- **⚡ Deploy**

[](https://cloud.sealos.io/?openapp=system-fastdeploy%3FtemplateName%3Dfastgpt)
[](https://cloud.sealos.io/?openapp=system-fastdeploy%3FtemplateName%3Dfastgpt&uid=fnWRt09fZP)

After deployment, wait 2-4 minutes while the database is set up. Since the basic settings are used, it may be a little slow at first.
@@ -44,7 +44,7 @@ weight: 707

#### 1. Apply for a Sealos AI proxy API Key

[Click to open the Sealos Pdf parser official site](https://cloud.sealos.run/?uid=fnWRt09fZP&openapp=system-aiproxy) and apply for the corresponding API Key.
[Click to open the Sealos Pdf parser official site](https://hzh.sealos.run/?uid=fnWRt09fZP&openapp=system-aiproxy) and apply for the corresponding API Key.

#### 2. Modify the FastGPT configuration file
@@ -29,7 +29,7 @@ weight: 744

{{% alert icon=" " context="info" %}}
- [SiliconCloud (硅基流动)](https://cloud.siliconflow.cn/i/TR9Ym0c4): a platform for calling open-source models.
- [Sealos AIProxy](https://cloud.sealos.run/?uid=fnWRt09fZP&openapp=system-aiproxy): proxies the major domestic model providers, so there is no need to apply for each API separately.
- [Sealos AIProxy](https://hzh.sealos.run/?uid=fnWRt09fZP&openapp=system-aiproxy): proxies the major domestic model providers, so there is no need to apply for each API separately.
{{% /alert %}}

After the models are configured in OneAPI, you can open the FastGPT page and enable the corresponding models.
@@ -23,7 +23,7 @@ FastGPT currently uses a model-separated deployment; FastGPT itself is only compatible with OpenAI

### Sealos version

* Beijing region: [Click to deploy OneAPI](https://hzh.sealos.run/?openapp=system-template%3FtemplateName%3Done-api)
* Singapore region (GPT available): [Click to deploy OneAPI](https://cloud.sealos.io/?openapp=system-template%3FtemplateName%3Done-api)
* Singapore region (GPT available): [Click to deploy OneAPI](https://cloud.sealos.io/?openapp=system-template%3FtemplateName%3Done-api&uid=fnWRt09fZP)
@@ -9,7 +9,7 @@ weight: 951

## Log in to Sealos

[Sealos](https://cloud.sealos.io/)
[Sealos](https://cloud.sealos.io?uid=fnWRt09fZP)

## Create an application
@@ -26,13 +26,13 @@ FastGPT uses the one-api project to manage its model pool, which is compatible with OpenAI, A

The Singapore region servers are located overseas and can reach OpenAI directly, but users in mainland China need a proxy to access the Singapore region reliably. The international region is slightly more expensive. Click the button below to deploy 👇

<a href="https://template.cloud.sealos.io/deploy?templateName=fastgpt" rel="external" target="_blank"><img src="https://cdn.jsdelivr.net/gh/labring-actions/templates@main/Deploy-on-Sealos.svg" alt="Deploy on Sealos"/></a>
<a href="https://template.cloud.sealos.io/deploy?templateName=fastgpt&uid=fnWRt09fZP" rel="external" target="_blank"><img src="https://cdn.jsdelivr.net/gh/labring-actions/templates@main/Deploy-on-Sealos.svg" alt="Deploy on Sealos"/></a>

### Beijing region

The Beijing region service provider is 火山云; users in mainland China can access it reliably, but it cannot reach overseas services such as OpenAI. Pricing is roughly 1/4 of the Singapore region. Click the button below to deploy 👇

<a href="https://bja.sealos.run/?openapp=system-template%3FtemplateName%3Dfastgpt" rel="external" target="_blank"><img src="https://raw.githubusercontent.com/labring-actions/templates/main/Deploy-on-Sealos.svg" alt="Deploy on Sealos"/></a>
<a href="https://bja.sealos.run/?openapp=system-template%3FtemplateName%3Dfastgpt&uid=fnWRt09fZP" rel="external" target="_blank"><img src="https://raw.githubusercontent.com/labring-actions/templates/main/Deploy-on-Sealos.svg" alt="Deploy on Sealos"/></a>

### 1. Start the deployment
@@ -13,7 +13,7 @@ FastGPT V4.5 introduces the HNSW index from PgVector 0.5, greatly improving knowledge

## PgVector upgrade: Sealos deployment

1. Open the database application on the [Sealos desktop](https://cloud.sealos.io).
1. Open the database application on the [Sealos desktop](https://cloud.sealos.io?uid=fnWRt09fZP).
2. Open the details of the [pg] database.
3. Click Restart in the upper-right corner and wait for the restart to complete.
4. Click the one-click connect on the left and wait for the Terminal to open.
@@ -9,16 +9,20 @@ weight: 799

## 🚀 New Features

1. The commercial edition supports single-team mode for easier management of internal members.

## ⚙️ Improvements

1. Knowledge base data input box interaction.
2. Fetching the knowledge base data bound to an app is now handled by the backend.
3. Added dependency security version checks and upgraded some dependencies.
4. Model test code.

## 🐛 Bug Fixes

1. The maximum response tokens hint was displayed incorrectly.
2. In the HTTP Node, strings containing line breaks failed to parse.
3. Chat history was not passed during knowledge base question optimization.
4. Missing translations for error messages.
4. Missing translations for error messages.
5. In the content extraction node, the array type schema was wrong.
6. When testing a model channel, the specified channel was not actually used.
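Item 4 under Improvements ("Model test code") corresponds to the reworked connectivity checks later in this commit: the LLM test now asks for a streamed completion and stops as soon as the first token arrives instead of waiting for a full non-streaming answer. A minimal sketch of that probe, assuming the official `openai` Node SDK rather than the repository's `getAIApi` wrapper:

```ts
import OpenAI from 'openai';

// Probe an LLM endpoint: request a streamed completion, resolve as soon as the
// first content (or reasoning) delta arrives, then abort so the test does not
// wait for, or pay for, a full response.
export const probeLLMModel = async (model: string, baseURL: string, apiKey: string) => {
  const ai = new OpenAI({ baseURL, apiKey, timeout: 10000 });

  const stream = await ai.chat.completions.create({
    model,
    messages: [{ role: 'user', content: 'hi' }],
    stream: true
  });

  for await (const part of stream) {
    const delta = part.choices?.[0]?.delta;
    // reasoning_content is a provider extension and is not in the SDK types.
    // @ts-ignore
    if (delta?.content || delta?.reasoning_content) {
      stream.controller.abort(); // stop streaming after the first token
      return;
    }
  }

  throw new Error('Model response empty');
};
```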
@@ -30,7 +30,7 @@ A FastGPT upgrade consists of two steps:

## Changing the image on Sealos

1. Open [Sealos Cloud](https://cloud.sealos.io/) and find App Management on the desktop
1. Open [Sealos Cloud](https://cloud.sealos.io?uid=fnWRt09fZP) and find App Management on the desktop
@@ -14,7 +14,7 @@ weight: 303

This section describes how to deploy SearXNG on Sealos. For a Docker deployment, refer directly to the [SearXNG official tutorial](https://github.com/searxng/searxng).

Open the [Sealos Beijing region](https://bja.sealos.run/), click App Deployment, and create a new application:
Open the [Sealos Beijing region](https://bja.sealos.run?uid=fnWRt09fZP), click App Deployment, and create a new application:

| Open App Deployment | Click New Application |
| --- | --- |
@@ -130,7 +130,7 @@ doi_resolvers:

  default_doi_resolver: 'oadoi.org'
```

At present only the Bing engine works reliably from mainland China, so the configuration above only enables the bing engine. If you deploy overseas, you can use the [Sealos Singapore availability zone](https://cloud.sealos.io/), configure other search engines, and copy some engine entries from the [SearXNG default settings file](https://github.com/searxng/searxng/blob/master/searx/settings.yml). For example:
At present only the Bing engine works reliably from mainland China, so the configuration above only enables the bing engine. If you deploy overseas, you can use the [Sealos Singapore availability zone](https://cloud.sealos.io?uid=fnWRt09fZP), configure other search engines, and copy some engine entries from the [SearXNG default settings file](https://github.com/searxng/searxng/blob/master/searx/settings.yml). For example:

```
- name: duckduckgo
@@ -27,7 +27,7 @@ weight: 510

## Deploying the service on Sealos

[Visit Sealos](https://cloud.sealos.run/), log in, then open "App Management" -> "New Application".
[Visit Sealos](https://hzh.sealos.run?uid=fnWRt09fZP), log in, then open "App Management" -> "New Application".
- Application name: anything you like
- Image name: use aibotk/wechat-assistant for personal WeChat, or aibotk/worker-assistant for WeCom
- CPU and memory: 1c1g is recommended
@@ -4,7 +4,10 @@ export type ContextExtractAgentItemType = {
  valueType:
    | WorkflowIOValueTypeEnum.string
    | WorkflowIOValueTypeEnum.number
    | WorkflowIOValueTypeEnum.boolean;
    | WorkflowIOValueTypeEnum.boolean
    | WorkflowIOValueTypeEnum.arrayString
    | WorkflowIOValueTypeEnum.arrayNumber
    | WorkflowIOValueTypeEnum.arrayBoolean;
  desc: string;
  key: string;
  required: boolean;
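The extra `arrayString` / `arrayNumber` / `arrayBoolean` members relate to fix 5 in this release ("array type schema error in the content extraction node"). A hedged sketch of how such value types could be mapped to JSON Schema fragments for the extraction tool; the `valueTypeToJsonSchema` helper below is illustrative and not part of the repository:

```ts
// Illustrative only: map a workflow value type onto a JSON Schema fragment.
// The member names mirror the union above; the schema shapes are assumptions.
type ValueType = 'string' | 'number' | 'boolean' | 'arrayString' | 'arrayNumber' | 'arrayBoolean';

const valueTypeToJsonSchema = (valueType: ValueType): Record<string, unknown> => {
  switch (valueType) {
    case 'arrayString':
      return { type: 'array', items: { type: 'string' } };
    case 'arrayNumber':
      return { type: 'array', items: { type: 'number' } };
    case 'arrayBoolean':
      return { type: 'array', items: { type: 'boolean' } };
    default:
      return { type: valueType }; // 'string' | 'number' | 'boolean'
  }
};

// Example: a field declared as arrayString becomes
// { type: 'array', items: { type: 'string' } } in the tool parameter schema.
```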
@@ -6,10 +6,12 @@ import { getSTTModel } from '../model';

export const aiTranscriptions = async ({
  model,
  fileStream
  fileStream,
  headers
}: {
  model: string;
  fileStream: fs.ReadStream;
  headers?: Record<string, string>;
}) => {
  const data = new FormData();
  data.append('model', model);

@@ -30,7 +32,8 @@ export const aiTranscriptions = async ({
      Authorization: modelData.requestAuth
        ? `Bearer ${modelData.requestAuth}`
        : aiAxiosConfig.authorization,
      ...data.getHeaders()
      ...data.getHeaders(),
      ...headers
    },
    data: data
  });
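With the new optional `headers` parameter, callers of `aiTranscriptions` can attach per-request headers; the model test endpoint later in this commit uses this to pin an aiproxy channel. A usage sketch, where the import path, model name, and file path are assumptions:

```ts
import fs from 'fs';
import { aiTranscriptions } from './transcriptions'; // assumed module path

const main = async () => {
  // Transcribe a local file while forcing the request through one aiproxy channel.
  // 'Aiproxy-Channel' is the header the model-test API in this commit sends;
  // 'whisper-1' and the file path are example values only.
  const { text } = await aiTranscriptions({
    model: 'whisper-1',
    fileStream: fs.createReadStream('data/test.mp3'),
    headers: { 'Aiproxy-Channel': '42' }
  });
  console.log(text);
};

main();
```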
@@ -76,7 +76,7 @@
    {
      "model": "qwen-max",
      "name": "Qwen-max",
      "maxContext": 8000,
      "maxContext": 32000,
      "maxResponse": 4000,
      "quoteMaxToken": 6000,
      "maxTemperature": 1,
@@ -8,10 +8,11 @@ type GetVectorProps = {
  model: EmbeddingModelItemType;
  input: string;
  type?: `${EmbeddingTypeEnm}`;
  headers?: Record<string, string>;
};

// text to vector
export async function getVectorsByText({ model, input, type }: GetVectorProps) {
export async function getVectorsByText({ model, input, type, headers }: GetVectorProps) {
  if (!input) {
    return Promise.reject({
      code: 500,

@@ -37,9 +38,10 @@ export async function getVectorsByText({ model, input, type }: GetVectorProps) {
          path: model.requestUrl,
          headers: model.requestAuth
            ? {
                Authorization: `Bearer ${model.requestAuth}`
                Authorization: `Bearer ${model.requestAuth}`,
                ...headers
              }
            : undefined
            : headers
        }
      : {}
  )
@@ -16,11 +16,13 @@ type ReRankCallResult = { id: string; score?: number }[];
export function reRankRecall({
  model = getDefaultRerankModel(),
  query,
  documents
  documents,
  headers
}: {
  model?: ReRankModelItemType;
  query: string;
  documents: { id: string; text: string }[];
  headers?: Record<string, string>;
}): Promise<ReRankCallResult> {
  if (!model) {
    return Promise.reject('no rerank model');

@@ -41,7 +43,8 @@ export function reRankRecall({
    },
    {
      headers: {
        Authorization: model.requestAuth ? `Bearer ${model.requestAuth}` : authorization
        Authorization: model.requestAuth ? `Bearer ${model.requestAuth}` : authorization,
        ...headers
      },
      timeout: 30000
    }
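The embedding, rerank, TTS, and transcription changes in this commit all follow the same pattern: when the model defines its own `requestAuth`, use it for `Authorization`, otherwise fall back to the shared default, and merge any per-request headers either way. A small helper expressing that pattern; the helper itself is illustrative, not code from the repository:

```ts
// Illustrative helper: build request headers for a model call.
// `requestAuth` is the per-model key for custom endpoints; `defaultAuth` is the
// shared upstream authorization; `extra` carries per-request headers such as
// 'Aiproxy-Channel'.
const buildModelHeaders = (
  requestAuth: string | undefined,
  defaultAuth: string | undefined,
  extra: Record<string, string> = {}
): Record<string, string> => ({
  ...(requestAuth
    ? { Authorization: `Bearer ${requestAuth}` }
    : defaultAuth
      ? { Authorization: defaultAuth }
      : {}),
  ...extra
});

// e.g. buildModelHeaders(model.requestAuth, authorization, { 'Aiproxy-Channel': '42' })
```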
@@ -12,12 +12,12 @@ export async function listAppDatasetDataByTeamIdAndDatasetIds({
  datasetIdList: string[];
}) {
  const myDatasets = await MongoDataset.find({
    teamId,
    _id: { $in: datasetIdList }
    _id: { $in: datasetIdList },
    ...(teamId && { teamId })
  }).lean();

  return myDatasets.map((item) => ({
    datasetId: item._id,
    datasetId: String(item._id),
    avatar: item.avatar,
    name: item.name,
    vectorModel: getEmbeddingModel(item.vectorModel)

@@ -47,7 +47,7 @@ export async function rewriteAppWorkflowToDetail({

  const datasetIds = Array.isArray(rawValue)
    ? rawValue.map((v) => v?.datasetId).filter((id) => !!id && typeof id === 'string')
    : rawValue.datasetId
    : rawValue?.datasetId
      ? [String(rawValue.datasetId)]
      : [];
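`teamId` is now effectively optional in the query above: the root user passes `undefined` and gets an unscoped lookup, while everyone else stays scoped to their team. The `...(teamId && { teamId })` conditional spread is what makes this work; a minimal sketch of the idiom:

```ts
// Conditional spread: only add the teamId constraint when teamId is defined.
// `teamId && { teamId }` evaluates to a falsy value or to `{ teamId }`, and
// spreading a falsy value into an object literal is a no-op.
const buildDatasetFilter = (datasetIdList: string[], teamId?: string) => ({
  _id: { $in: datasetIdList },
  ...(teamId && { teamId })
});

console.log(buildDatasetFilter(['a', 'b']));           // { _id: { $in: ['a', 'b'] } }
console.log(buildDatasetFilter(['a', 'b'], 'team-1')); // { _id: { $in: [...] }, teamId: 'team-1' }
```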
@@ -61,38 +61,63 @@ export async function rewriteAppWorkflowToDetail({
    teamId: isRoot ? undefined : teamId,
    datasetIdList: Array.from(datasetIdSet)
  });

  const datasetMap = new Map(datasetList.map((ds) => [String(ds.datasetId), ds]));

  // Rewrite dataset ids, add dataset info to nodes
  nodes.forEach((node) => {
    if (node.flowNodeType !== FlowNodeTypeEnum.datasetSearchNode) return;
  if (datasetList.length > 0) {
    nodes.forEach((node) => {
      if (node.flowNodeType !== FlowNodeTypeEnum.datasetSearchNode) return;

    node.inputs.forEach((item) => {
      if (item.key !== NodeInputKeyEnum.datasetSelectList) return;
      node.inputs.forEach((item) => {
        if (item.key !== NodeInputKeyEnum.datasetSelectList) return;

      const val = item.value as undefined | { datasetId: string }[] | { datasetId: string };
        const val = item.value as undefined | { datasetId: string }[] | { datasetId: string };

      if (Array.isArray(val)) {
        item.value = val.map((v) => {
          const data = datasetMap.get(String(v.datasetId))!;
          return {
            datasetId: data.datasetId,
            avatar: data.avatar,
            name: data.name,
            vectorModel: data.vectorModel
          };
        });
      } else if (typeof val === 'object' && val !== null) {
        const data = datasetMap.get(String(val.datasetId))!;
        item.value = {
          datasetId: data.datasetId,
          avatar: data.avatar,
          name: data.name,
          vectorModel: data.vectorModel
        };
      }
        if (Array.isArray(val)) {
          item.value = val
            .map((v) => {
              const data = datasetMap.get(String(v.datasetId));
              if (!data)
                return {
                  datasetId: v.datasetId,
                  avatar: '',
                  name: 'Dataset not found',
                  vectorModel: ''
                };
              return {
                datasetId: data.datasetId,
                avatar: data.avatar,
                name: data.name,
                vectorModel: data.vectorModel
              };
            })
            .filter(Boolean);
        } else if (typeof val === 'object' && val !== null) {
          const data = datasetMap.get(String(val.datasetId));
          if (!data) {
            item.value = [
              {
                datasetId: val.datasetId,
                avatar: '',
                name: 'Dataset not found',
                vectorModel: ''
              }
            ];
          } else {
            item.value = [
              {
                datasetId: data.datasetId,
                avatar: data.avatar,
                name: data.name,
                vectorModel: data.vectorModel
              }
            ];
          }
        }
      });
    });
    });
  }

  return nodes;
}
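The rewritten block above drops the non-null assertions on `datasetMap.get(...)` and substitutes a placeholder entry when a referenced dataset no longer exists, so the workflow detail no longer breaks on stale references. The core pattern, condensed into a standalone sketch with simplified type names:

```ts
type DatasetInfo = { datasetId: string; avatar: string; name: string; vectorModel: string };

// Resolve a referenced dataset id against the fetched list; fall back to a
// placeholder entry when the dataset no longer exists (or is not accessible).
const resolveDataset = (datasetMap: Map<string, DatasetInfo>, datasetId: string): DatasetInfo =>
  datasetMap.get(String(datasetId)) ?? {
    datasetId,
    avatar: '',
    name: 'Dataset not found',
    vectorModel: ''
  };

// A node's selected datasets (single object or array) are normalised to an array
// of resolved entries, mirroring what rewriteAppWorkflowToDetail now stores.
const resolveSelected = (
  datasetMap: Map<string, DatasetInfo>,
  value: { datasetId: string } | { datasetId: string }[]
): DatasetInfo[] =>
  (Array.isArray(value) ? value : [value]).map((v) => resolveDataset(datasetMap, v.datasetId));
```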
@@ -202,7 +202,7 @@ ${description ? `- ${description}` : ''}
    properties[item.key] = {
      ...jsonSchema,
      description: item.desc,
      ...(item.enum ? { enum: item.enum.split('\n') } : {})
      ...(item.enum ? { enum: item.enum.split('\n').filter(Boolean) } : {})
    };
  });
  // function body
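`split('\n')` keeps the empty strings produced by blank lines or a trailing newline in the enum textarea, which would otherwise become empty enum values in the generated schema; `filter(Boolean)` drops them:

```ts
const raw = 'red\ngreen\n\nblue\n';
console.log(raw.split('\n'));                 // ['red', 'green', '', 'blue', '']
console.log(raw.split('\n').filter(Boolean)); // ['red', 'green', 'blue']
```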
@@ -21,7 +21,7 @@ import {
  FlowNodeInputTypeEnum,
  FlowNodeTypeEnum
} from '@fastgpt/global/core/workflow/node/constant';
import { getNanoid, replaceVariable } from '@fastgpt/global/common/string/tools';
import { getNanoid } from '@fastgpt/global/common/string/tools';
import { getSystemTime } from '@fastgpt/global/common/time/timezone';

import { dispatchWorkflowStart } from './init/workflowStart';

@@ -426,6 +426,14 @@ export async function dispatchWorkFlow(data: Props): Promise<DispatchFlowRespons
      })();

      if (!nodeRunResult) return [];
      if (res?.closed) {
        addLog.warn('Request is closed', {
          appId: props.runningAppInfo.id,
          nodeId: node.nodeId,
          nodeName: node.name
        });
        return [];
      }

      /*
        Special cases:
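The added guard stops scheduling further nodes once the client connection is gone: if the response stream reports `closed`, the dispatcher logs a warning and returns an empty result instead of running the node. A condensed sketch of that check, with the surrounding workflow types reduced to plain values:

```ts
import type { ServerResponse } from 'http';

// Skip node execution when the HTTP response has already been closed
// (client disconnected or request aborted), mirroring the dispatcher guard.
const shouldSkipNode = (
  res: ServerResponse | undefined,
  ctx: { appId: string; nodeId: string; nodeName: string },
  warn: (msg: string, meta: Record<string, string>) => void
): boolean => {
  if (res?.closed) {
    warn('Request is closed', ctx);
    return true;
  }
  return false;
};
```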
@@ -35,20 +35,26 @@ async function handler(

  if (!modelData) return Promise.reject('Model not found');

  const headers: Record<string, string> = channelId
    ? {
        'Aiproxy-Channel': channelId
      }
    : {};

  if (modelData.type === 'llm') {
    return testLLMModel(modelData);
    return testLLMModel(modelData, headers);
  }
  if (modelData.type === 'embedding') {
    return testEmbeddingModel(modelData);
    return testEmbeddingModel(modelData, headers);
  }
  if (modelData.type === 'tts') {
    return testTTSModel(modelData);
    return testTTSModel(modelData, headers);
  }
  if (modelData.type === 'stt') {
    return testSTTModel(modelData);
    return testSTTModel(modelData, headers);
  }
  if (modelData.type === 'rerank') {
    return testReRankModel(modelData);
    return testReRankModel(modelData, headers);
  }

  return Promise.reject('Model type not supported');
@@ -56,7 +62,7 @@ async function handler(

export default NextAPI(handler);

const testLLMModel = async (model: LLMModelItemType) => {
const testLLMModel = async (model: LLMModelItemType, headers: Record<string, string>) => {
  const ai = getAIApi({
    timeout: 10000
  });

@@ -65,7 +71,7 @@ const testLLMModel = async (model: LLMModelItemType) => {
    {
      model: model.model,
      messages: [{ role: 'user', content: 'hi' }],
      stream: false
      stream: true
    },
    model
  );
@@ -73,30 +79,38 @@ const testLLMModel = async (model: LLMModelItemType) => {
    ...(model.requestUrl ? { path: model.requestUrl } : {}),
    headers: model.requestAuth
      ? {
          Authorization: `Bearer ${model.requestAuth}`
          Authorization: `Bearer ${model.requestAuth}`,
          ...headers
        }
      : undefined
      : headers
  });

  const responseText = response.choices?.[0]?.message?.content;
  // @ts-ignore
  const reasoning_content = response.choices?.[0]?.message?.reasoning_content;

  if (!responseText && !reasoning_content) {
    return Promise.reject('Model response empty');
  for await (const part of response) {
    const content = part.choices?.[0]?.delta?.content || '';
    // @ts-ignore
    const reasoningContent = part.choices?.[0]?.delta?.reasoning_content || '';
    if (content || reasoningContent) {
      response?.controller?.abort();
      return;
    }
  }
  addLog.info(`Model not stream response`);

  addLog.info(`Model test response: ${responseText}`);
  return Promise.reject('Model response empty');
};

const testEmbeddingModel = async (model: EmbeddingModelItemType) => {
const testEmbeddingModel = async (
  model: EmbeddingModelItemType,
  headers: Record<string, string>
) => {
  return getVectorsByText({
    input: 'Hi',
    model
    model,
    headers
  });
};

const testTTSModel = async (model: TTSModelType) => {
const testTTSModel = async (model: TTSModelType, headers: Record<string, string>) => {
  const ai = getAIApi({
    timeout: 10000
  });
@@ -113,27 +127,30 @@ const testTTSModel = async (model: TTSModelType) => {
        path: model.requestUrl,
        headers: model.requestAuth
          ? {
              Authorization: `Bearer ${model.requestAuth}`
              Authorization: `Bearer ${model.requestAuth}`,
              ...headers
            }
          : undefined
          : headers
      }
    : {}
  );
};

const testSTTModel = async (model: STTModelType) => {
const testSTTModel = async (model: STTModelType, headers: Record<string, string>) => {
  const path = isProduction ? '/app/data/test.mp3' : 'data/test.mp3';
  const { text } = await aiTranscriptions({
    model: model.model,
    fileStream: fs.createReadStream(path)
    fileStream: fs.createReadStream(path),
    headers
  });
  addLog.info(`STT result: ${text}`);
};

const testReRankModel = async (model: ReRankModelItemType) => {
const testReRankModel = async (model: ReRankModelItemType, headers: Record<string, string>) => {
  await reRankRecall({
    model,
    query: 'Hi',
    documents: [{ id: '1', text: 'Hi' }]
    documents: [{ id: '1', text: 'Hi' }],
    headers
  });
};