perf: api dataset code
[Image diff: 23 documentation screenshots recompressed; each roughly halved in size (e.g. 97 KiB → 42 KiB).]
@@ -1,5 +1,5 @@
 ---
-title: 'V4.9.1'
+title: 'V4.9.1(包含升级脚本)'
 description: 'FastGPT V4.9.1 更新说明'
 icon: 'upgrade'
 draft: false
@@ -7,6 +7,21 @@ toc: true
 weight: 789
 ---
+
+## 执行升级脚本
+
+该脚本仅需商业版用户执行。
+
+从任意终端,发起 1 个 HTTP 请求。其中 {{rootkey}} 替换成环境变量里的 `rootkey`;{{host}} 替换成**FastGPT 域名**。
+
+```bash
+curl --location --request POST 'https://{{host}}/api/admin/initv4911' \
+--header 'rootkey: {{rootkey}}' \
+--header 'Content-Type: application/json'
+```
+
+**脚本功能**
+
+1. 移动第三方知识库 API 配置。
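For reference, the migration this endpoint performs is the new `projects/app/src/pages/api/admin/initv4911.ts` handler further down in this PR: it folds the three legacy per-provider fields into the new `apiDatasetServer` object. A minimal sketch of that update (the function name `migrateApiDatasetConfig` is only a label for this sketch; the real code runs inside the API handler):

```ts
// Sketch: fold legacy apiServer / feishuServer / yuqueServer config into apiDatasetServer.
// Mirrors the logic of the initv4911 handler added in this PR.
import { MongoDataset } from '@fastgpt/service/core/dataset/schema';
import { DatasetTypeEnum } from '@fastgpt/global/core/dataset/constants';

export async function migrateApiDatasetConfig() {
  const datasets = await MongoDataset.find({
    type: { $in: [DatasetTypeEnum.apiDataset, DatasetTypeEnum.feishu, DatasetTypeEnum.yuque] }
  }).lean();

  for (const dataset of datasets) {
    await MongoDataset.updateOne(
      { _id: dataset._id },
      {
        $set: {
          apiDatasetServer: {
            // keep only the provider config that was actually set on this dataset
            ...(dataset.apiServer && { apiServer: dataset.apiServer }),
            ...(dataset.feishuServer && { feishuServer: dataset.feishuServer }),
            ...(dataset.yuqueServer && { yuqueServer: dataset.yuqueServer })
          }
        }
      }
    );
  }
}
```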
 ## 🚀 新增内容
@@ -1,5 +1,5 @@
 ---
-title: 'V4.9.4'
+title: 'V4.9.4(包含升级脚本)'
 description: 'FastGPT V4.9.4 更新说明'
 icon: 'upgrade'
 draft: false
@@ -7,142 +7,55 @@ toc: true
 weight: 410
 ---

-目前,互联网上拥有各种各样的文档库,例如飞书,语雀等等。 FastGPT 的不同用户可能使用的文档库不同,然而开发人手不够,FastGPT 目前只支持飞书,语雀,api ,web 站点这几个知识库。为了满足广大用户对其他知识库需求,同时增强开源性,现在教学如何自己开发第三方知识库。
+目前,互联网上拥有各种各样的文档库,例如飞书,语雀等等。 FastGPT 的不同用户可能使用的文档库不同,目前 FastGPT 内置了飞书、语雀文档库,如果需要接入其他文档库,可以参考本节内容。
-## 准备本地开发环境
-
-想要开发 FastGPT ,首先要拥有本地开发环境,具体参考[快速开始本地开发](../../development/intro.md)
-
-## 开始开发
+## 统一的接口规范
+
+为了实现对不同文档库的统一接入,FastGPT 对第三方文档库进行了接口的规范,共包含 4 个接口内容,可以[查看 API 文件库接口](/docs/guide/knowledge_base/api_datase)。
+
+所有内置的文档库,都是基于标准的 API 文件库进行扩展。可以参考`FastGPT/packages/service/core/dataset/apiDataset/yuqueDataset/api.ts`中的代码,进行其他文档库的扩展。一共需要完成 4 个接口开发:
+
+1. 获取文件列表
+2. 获取文件内容/文件链接
+3. 获取原文预览地址
+4. 获取文件详情信息
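As a rough TypeScript sketch, the contract behind these four functions looks like the following. The imported type names are the ones this PR defines in `packages/global/core/dataset/apiDataset/type`; the `listFiles`/`getFileContent` parameter shapes match the call sites visible in the diff, while the last two signatures and the `DatasetApiHook` name are illustrative only:

```ts
// Illustrative shape of the four-function contract a dataset hook returns.
import type {
  APIFileItem,
  ApiFileReadContentResponse,
  ApiDatasetDetailResponse
} from '@fastgpt/global/core/dataset/apiDataset/type';
import type { ParentIdType } from '@fastgpt/global/common/parentFolder/type';

export type DatasetApiHook = {
  // 1. list files/folders under parentId (or the configured basePath when empty)
  listFiles: (e: { parentId?: ParentIdType; searchKey?: string }) => Promise<APIFileItem[]>;
  // 2. raw text (or a downloadable link) for a single file
  getFileContent: (e: {
    teamId: string;
    tmbId: string;
    apiFileId: string;
  }) => Promise<ApiFileReadContentResponse>;
  // 3. URL of the original page, used for previewing the source document
  getFilePreviewUrl: (e: { apiFileId: string }) => Promise<string>;
  // 4. file metadata (name, parentId, id)
  getFileDetail: (e: { apiFileId: string }) => Promise<ApiDatasetDetailResponse>;
};
```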
|
||||||
|
|
||||||
|
## 开始一个第三方文件库
|
||||||
|
|
||||||
为了方便讲解,这里以添加飞书知识库为例。
|
为了方便讲解,这里以添加飞书知识库为例。
|
||||||
|
|
||||||
首先,要进入 FastGPT 项目路径下的`FastGPT\packages\global\core\dataset\apiDataset.d.ts`文件,添加自己的知识库 Server 类型。
|
### 1. 添加第三方文档库参数
|
||||||
|
|
||||||
{{% alert icon="🤖 " context="success" %}}
|
首先,要进入 FastGPT 项目路径下的`FastGPT\packages\global\core\dataset\apiDataset.d.ts`文件,添加第三方文档库 Server 类型。例如,语雀文档中,需要提供`userId`、`token`两个字段作为鉴权信息。
|
||||||
知识库类型的字段设计是依赖于自己的知识库需要什么字段进行后续的api调用。
|
|
||||||
如果知识库有`根目录`选择的功能,需要设置添加一个字段`basePath`。[点击查看`根目录`功能](/docs/guide/knowledge_base/third_dataset/#添加配置表单)
|
|
||||||
{{% /alert %}}
|
|
||||||
|
|
||||||

|
```ts
|
||||||
|
export type YuqueServer = {
|
||||||
然后需要在 FastGPT 项目路径`FastGPT\packages\service\core\dataset\apiDataset\`下创建一个需要添加的文件夹,这里是`feishuKownledgeDataset`,在添加的文件夹下创建一个`api.ts`,如图:
|
userId: string;
|
||||||
|
token?: string;
|
||||||

|
basePath?: string;
|
||||||
|
|
||||||
## `api.ts`文件内容
|
|
||||||
|
|
||||||
首先,需要完成一些导入操作,例如
|
|
||||||
|
|
||||||
```TS
|
|
||||||
import type {
|
|
||||||
APIFileItem,
|
|
||||||
ApiFileReadContentResponse,
|
|
||||||
ApiDatasetDetailResponse,
|
|
||||||
FeishuKnowledgeServer //这里是之前添加的知识库类型Server
|
|
||||||
} from '@fastgpt/global/core/dataset/apiDataset';
|
|
||||||
import { type ParentIdType } from '@fastgpt/global/common/parentFolder/type';
|
|
||||||
import axios, { type Method } from 'axios';
|
|
||||||
import { addLog } from '../../../common/system/log';
|
|
||||||
```
|
|
||||||
|
|
||||||
之后定义一些返回体,需要根据自己要调用的 api 接口的返回类型进行设计。这里例如:
|
|
||||||
```TS
|
|
||||||
type ResponseDataType = {
|
|
||||||
success: boolean;
|
|
||||||
message: string;
|
|
||||||
data: any;
|
|
||||||
};
|
|
||||||
|
|
||||||
/**
|
|
||||||
* Request
|
|
||||||
*/
|
|
||||||
type FeishuFileListResponse = {
|
|
||||||
items: {
|
|
||||||
title: string;
|
|
||||||
creator: string;
|
|
||||||
has_child: boolean;
|
|
||||||
parent_node_token: string;
|
|
||||||
owner_id: string;
|
|
||||||
space_id: string;
|
|
||||||
node_token: string;
|
|
||||||
node_type: string;
|
|
||||||
node_create_time: number;
|
|
||||||
obj_edit_time: number;
|
|
||||||
obj_create_time: number;
|
|
||||||
obj_token: string;
|
|
||||||
obj_type: string;
|
|
||||||
origin_node_token: string;
|
|
||||||
origin_space_id: string;
|
|
||||||
}[];
|
|
||||||
has_more: boolean;
|
|
||||||
next_page_token: string;
|
|
||||||
};
|
};
|
||||||
```
|
```
|
||||||
|
|
||||||
需要先设计设计一个函数,函数名以`知识库类型+Request`为例,例如:
|
|
||||||
|
|
||||||
```TS
|
|
||||||
export const useFeishuKnowledgeDatasetRequest = ({
|
|
||||||
feishuKnowledgeServer
|
|
||||||
}: {
|
|
||||||
feishuKnowledgeServer: FeishuKnowledgeServer;
|
|
||||||
}) => {}
|
|
||||||
```
|
|
||||||
|
|
||||||
函数定义完成后,需要完成 api 方法的设计,需要以下四个方法:
|
|
||||||
|
|
||||||
{{% alert icon="🤖 " context="success" %}}
|
{{% alert icon="🤖 " context="success" %}}
|
||||||
方法的具体设计,可以参考`projects\app\src\service\core\dataset\`下的任何一个知识库的`api.ts`文件,知识库文件夹以`dataset`结尾
|
如果文档库有`根目录`选择的功能,需要设置添加一个字段`basePath`
|
||||||
{{% /alert %}}
|
{{% /alert %}}
|
||||||
|
|
||||||
-| 方法名 | 返回体 | 说明 |
-| --- | --- | --- |
-| listFiles | id,parentId,name,type,hasChild,updateTime,createTime | 用于获取知识库的文件列表 |
-| getFileContent | title,rawText | 用于获取知识库文件内容 |
-| getFileDetail | name,parentId,id | 用于获取知识库文件详细信息 |
-| getFilePreviewUrl | '网址' | 用于获取知识库文件原始页面 |
-
-在设计好`api.ts`文件后,需要在`projects\app\src\service\core\dataset\apidataset\index.ts`里,添加之前写好的函数,例如:
-
-![](/imgs/third-dataset-6.png)
+### 2. 创建 Hook 文件
+
+每个第三方文档库都会采用 Hook 的方式来实现一套 API 接口的维护,Hook 里包含 4 个函数需要完成。
+
+- 在`FastGPT\packages\service\core\dataset\apiDataset\`下创建一个文档库的文件夹,然后在文件夹下创建一个`api.ts`文件
+- 在`api.ts`文件中,需要完成 4 个函数的定义,分别是:
+- `listFiles`:获取文件列表
+- `getFileContent`:获取文件内容/文件链接
+- `getFileDetail`:获取文件详情信息
+- `getFilePreviewUrl`:获取原文预览地址
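A minimal skeleton for such an `api.ts` hook might look like the sketch below. It is modeled on the built-in hooks this PR references (`yuqueDataset/api.ts`, `feishuDataset/api.ts`); `MyDocLibServer`, `useMyDocLibDatasetRequest`, the endpoint paths and the response field names are all hypothetical placeholders, and the exact field types should be checked against `APIFileItem` and the other types in `packages/global/core/dataset/apiDataset/type`:

```ts
// packages/service/core/dataset/apiDataset/myDocLibDataset/api.ts (hypothetical example)
import type {
  APIFileItem,
  ApiFileReadContentResponse,
  ApiDatasetDetailResponse
} from '@fastgpt/global/core/dataset/apiDataset/type';
import type { ParentIdType } from '@fastgpt/global/common/parentFolder/type';
import axios from 'axios';

type MyDocLibServer = { baseUrl: string; token: string; basePath?: string };

export const useMyDocLibDatasetRequest = ({ myDocLibServer }: { myDocLibServer: MyDocLibServer }) => {
  const instance = axios.create({
    baseURL: myDocLibServer.baseUrl,
    headers: { Authorization: `Bearer ${myDocLibServer.token}` }
  });

  // 1. Map the library's folder listing onto APIFileItem
  const listFiles = async ({ parentId }: { parentId?: ParentIdType }): Promise<APIFileItem[]> => {
    const { data } = await instance.get('/files', {
      params: { parentId: parentId || myDocLibServer.basePath }
    });
    return data.items.map((item: any) => ({
      id: item.id,
      parentId: item.parentId,
      name: item.name,
      type: item.isFolder ? 'folder' : 'file',
      hasChild: item.isFolder,
      updateTime: new Date(item.updatedAt),
      createTime: new Date(item.createdAt)
    }));
  };

  // 2. Raw text of one document (FastGPT chunks and indexes this)
  const getFileContent = async ({
    apiFileId
  }: {
    teamId: string;
    tmbId: string;
    apiFileId: string;
  }): Promise<ApiFileReadContentResponse> => {
    const { data } = await instance.get(`/files/${apiFileId}/content`);
    return { title: data.title, rawText: data.markdown };
  };

  // 3. Link back to the original page for source preview
  const getFilePreviewUrl = async ({ apiFileId }: { apiFileId: string }) =>
    `${myDocLibServer.baseUrl}/preview/${apiFileId}`;

  // 4. File metadata
  const getFileDetail = async ({ apiFileId }: { apiFileId: string }): Promise<ApiDatasetDetailResponse> => {
    const { data } = await instance.get(`/files/${apiFileId}`);
    return { id: data.id, name: data.name, parentId: data.parentId };
  };

  return { listFiles, getFileContent, getFilePreviewUrl, getFileDetail };
};
```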
-在完成了这些之后,现在,我们需要一些方法的支持。在`index.ts`文件里,查找函数`getApiDatasetRequest`的引用,如图:
-
-![](/imgs/third-dataset-7.png)
+### 3. 数据库添加配置字段
+
+- 在`packages/service/core/dataset/schema.ts` 中添加第三方文档库的配置字段,类型统一设置成`Object`。
+- 在`FastGPT/packages/global/core/dataset/type.d.ts`中添加第三方文档库配置字段的数据类型,类型设置为第一步创建的参数。
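Condensed from the corresponding hunks later in this diff, the two additions look roughly like this (surrounding fields and imports are omitted; `Schema` is the mongoose Schema already used in that file):

```ts
// packages/service/core/dataset/schema.ts — store the whole third-party config as a single object
const DatasetSchema = new Schema({
  // ...existing dataset fields...
  apiDatasetServer: Object
});

// packages/global/core/dataset/type.d.ts — type it with the server type created in step 1
import type { ApiDatasetServerType } from './apiDataset/type';

export type DatasetSchemaType = {
  // ...existing fields...
  apiDatasetServer?: ApiDatasetServerType;
};
```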
|
||||||
{{% alert icon="🤖 " context="warning" %}}
|
|
||||||
其中`getCatalog.ts`和`getPathNames.ts`文件是对根路径设置的支持,如果你的知识库不支持根路径设置,可以设置返回空。[点击查看`根目录`功能](/docs/guide/knowledge_base/third_dataset/#添加配置表单)如图:
|
|
||||||
|
|
||||||

|
|
||||||
|
|
||||||
{{% /alert %}}
|
|
||||||
|
|
||||||
可以看到有一些文件引用这个函数,这些就是知识库的方法,现在我们需要进入这些文件添加我们的知识库类型。以`list.ts`为例,如图添加:
|
|
||||||
|
|
||||||

|
|
||||||
|
|
||||||
{{% alert icon="🤖 " context="success" %}}
|
|
||||||
方法的具体添加,可以参考文件内的其他知识库。
|
|
||||||
{{% /alert %}}
|
|
||||||
|
|
||||||
在`FastGPT\projects\app\src\pages\api\core\dataset\detail.ts`文件中,添加如下内容。
|
|
||||||
|
|
||||||

|
|
||||||
|
|
||||||
在`FastGPT\projects\app\src\pages\api\core\dataset\update.ts`文件中,添加如下内容。
|
|
||||||
|
|
||||||
{{% alert icon="🤖 " context="warning" %}}
|
|
||||||
该文件主要是负责更新知识库配置的,如果不添加,会导致无法正常更新配置。
|
|
||||||
{{% /alert %}}
|
|
||||||
|
|
||||||

|
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
## 数据库类型添加
|
|
||||||
|
|
||||||
添加新的知识库,需要在`packages/service/core/dataset/schema.ts` 中添加自己的知识库类型,如图:
|
|
||||||
|
|
||||||

|

|
||||||
|
|
||||||
@@ -150,10 +63,9 @@ export const useFeishuKnowledgeDatasetRequest = ({
 `schema.ts`文件修改后,需要重新启动 FastGPT 项目才会生效。
 {{% /alert %}}

-## 添加知识库类型
-
-添加完这些之后,需要添加知识库类型,需要在`projects/app/src/web/core/dataset/constants.ts`中,添加自己的知识库类型
+### 4. 添加知识库类型
+
+在`projects/app/src/web/core/dataset/constants.ts`中,添加自己的知识库类型

 ```TS
 export const datasetTypeCourseMap: Record<`${DatasetTypeEnum}`, string> = {
|
||||||
|
|||||||
packages/global/core/dataset/api.d.ts
@ -17,6 +17,9 @@ import type { ParentIdType } from '../../common/parentFolder/type';
|
|||||||
/* ================= dataset ===================== */
|
/* ================= dataset ===================== */
|
||||||
export type DatasetUpdateBody = {
|
export type DatasetUpdateBody = {
|
||||||
id: string;
|
id: string;
|
||||||
|
|
||||||
|
apiDatasetServer?: DatasetSchemaType['apiDatasetServer'];
|
||||||
|
|
||||||
parentId?: ParentIdType;
|
parentId?: ParentIdType;
|
||||||
name?: string;
|
name?: string;
|
||||||
avatar?: string;
|
avatar?: string;
|
||||||
@ -28,9 +31,6 @@ export type DatasetUpdateBody = {
|
|||||||
websiteConfig?: DatasetSchemaType['websiteConfig'];
|
websiteConfig?: DatasetSchemaType['websiteConfig'];
|
||||||
externalReadUrl?: DatasetSchemaType['externalReadUrl'];
|
externalReadUrl?: DatasetSchemaType['externalReadUrl'];
|
||||||
defaultPermission?: DatasetSchemaType['defaultPermission'];
|
defaultPermission?: DatasetSchemaType['defaultPermission'];
|
||||||
apiServer?: DatasetSchemaType['apiServer'];
|
|
||||||
yuqueServer?: DatasetSchemaType['yuqueServer'];
|
|
||||||
feishuServer?: DatasetSchemaType['feishuServer'];
|
|
||||||
chunkSettings?: DatasetSchemaType['chunkSettings'];
|
chunkSettings?: DatasetSchemaType['chunkSettings'];
|
||||||
|
|
||||||
// sync schedule
|
// sync schedule
|
||||||
|
|||||||
@ -1,5 +1,5 @@
|
|||||||
import { RequireOnlyOne } from '../../common/type/utils';
|
import { RequireOnlyOne } from '../../../common/type/utils';
|
||||||
import type { ParentIdType } from '../../common/parentFolder/type.d';
|
import type { ParentIdType } from '../../../common/parentFolder/type';
|
||||||
|
|
||||||
export type APIFileItem = {
|
export type APIFileItem = {
|
||||||
id: string;
|
id: string;
|
||||||
@ -28,6 +28,12 @@ export type YuqueServer = {
|
|||||||
basePath?: string;
|
basePath?: string;
|
||||||
};
|
};
|
||||||
|
|
||||||
|
export type ApiDatasetServerType = {
|
||||||
|
apiServer?: APIFileServer;
|
||||||
|
feishuServer?: FeishuServer;
|
||||||
|
yuqueServer?: YuqueServer;
|
||||||
|
};
|
||||||
|
|
||||||
// Api dataset api
|
// Api dataset api
|
||||||
|
|
||||||
export type APIFileListResponse = APIFileItem[];
|
export type APIFileListResponse = APIFileItem[];
|
||||||
packages/global/core/dataset/apiDataset/utils.ts (new file)
@ -0,0 +1,31 @@
|
|||||||
|
import type { ApiDatasetServerType } from './type';
|
||||||
|
|
||||||
|
export const filterApiDatasetServerPublicData = (apiDatasetServer?: ApiDatasetServerType) => {
|
||||||
|
if (!apiDatasetServer) return undefined;
|
||||||
|
|
||||||
|
const { apiServer, yuqueServer, feishuServer } = apiDatasetServer;
|
||||||
|
|
||||||
|
return {
|
||||||
|
apiServer: apiServer
|
||||||
|
? {
|
||||||
|
baseUrl: apiServer.baseUrl,
|
||||||
|
authorization: '',
|
||||||
|
basePath: apiServer.basePath
|
||||||
|
}
|
||||||
|
: undefined,
|
||||||
|
yuqueServer: yuqueServer
|
||||||
|
? {
|
||||||
|
userId: yuqueServer.userId,
|
||||||
|
token: '',
|
||||||
|
basePath: yuqueServer.basePath
|
||||||
|
}
|
||||||
|
: undefined,
|
||||||
|
feishuServer: feishuServer
|
||||||
|
? {
|
||||||
|
appId: feishuServer.appId,
|
||||||
|
appSecret: '',
|
||||||
|
folderToken: feishuServer.folderToken
|
||||||
|
}
|
||||||
|
: undefined
|
||||||
|
};
|
||||||
|
};
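This new helper strips credentials (`authorization`, `token`, `appSecret`) before the per-provider config is exposed outside the server. A small usage sketch, assuming the `@fastgpt/global` path alias used elsewhere in this PR (where exactly the helper is called is not shown in this excerpt):

```ts
import { filterApiDatasetServerPublicData } from '@fastgpt/global/core/dataset/apiDataset/utils';

const publicConfig = filterApiDatasetServerPublicData({
  yuqueServer: { userId: 'u-123', token: 'secret-token', basePath: 'repo-1' }
});
// => {
//   apiServer: undefined,
//   yuqueServer: { userId: 'u-123', token: '', basePath: 'repo-1' },
//   feishuServer: undefined
// }
console.log(publicConfig);
```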
|
||||||
@ -6,11 +6,51 @@ export enum DatasetTypeEnum {
|
|||||||
dataset = 'dataset',
|
dataset = 'dataset',
|
||||||
websiteDataset = 'websiteDataset', // depp link
|
websiteDataset = 'websiteDataset', // depp link
|
||||||
externalFile = 'externalFile',
|
externalFile = 'externalFile',
|
||||||
|
|
||||||
apiDataset = 'apiDataset',
|
apiDataset = 'apiDataset',
|
||||||
feishu = 'feishu',
|
feishu = 'feishu',
|
||||||
yuque = 'yuque'
|
yuque = 'yuque'
|
||||||
}
|
}
|
||||||
export const DatasetTypeMap = {
|
|
||||||
|
// @ts-ignore
|
||||||
|
export const ApiDatasetTypeMap: Record<
|
||||||
|
`${DatasetTypeEnum}`,
|
||||||
|
{
|
||||||
|
icon: string;
|
||||||
|
label: any;
|
||||||
|
collectionLabel: string;
|
||||||
|
courseUrl?: string;
|
||||||
|
}
|
||||||
|
> = {
|
||||||
|
[DatasetTypeEnum.apiDataset]: {
|
||||||
|
icon: 'core/dataset/externalDatasetOutline',
|
||||||
|
label: i18nT('dataset:api_file'),
|
||||||
|
collectionLabel: i18nT('common:File'),
|
||||||
|
courseUrl: '/docs/guide/knowledge_base/api_dataset/'
|
||||||
|
},
|
||||||
|
[DatasetTypeEnum.feishu]: {
|
||||||
|
icon: 'core/dataset/feishuDatasetOutline',
|
||||||
|
label: i18nT('dataset:feishu_dataset'),
|
||||||
|
collectionLabel: i18nT('common:File'),
|
||||||
|
courseUrl: '/docs/guide/knowledge_base/lark_dataset/'
|
||||||
|
},
|
||||||
|
[DatasetTypeEnum.yuque]: {
|
||||||
|
icon: 'core/dataset/yuqueDatasetOutline',
|
||||||
|
label: i18nT('dataset:yuque_dataset'),
|
||||||
|
collectionLabel: i18nT('common:File'),
|
||||||
|
courseUrl: '/docs/guide/knowledge_base/yuque_dataset/'
|
||||||
|
}
|
||||||
|
};
|
||||||
|
export const DatasetTypeMap: Record<
|
||||||
|
`${DatasetTypeEnum}`,
|
||||||
|
{
|
||||||
|
icon: string;
|
||||||
|
label: any;
|
||||||
|
collectionLabel: string;
|
||||||
|
courseUrl?: string;
|
||||||
|
}
|
||||||
|
> = {
|
||||||
|
...ApiDatasetTypeMap,
|
||||||
[DatasetTypeEnum.folder]: {
|
[DatasetTypeEnum.folder]: {
|
||||||
icon: 'common/folderFill',
|
icon: 'common/folderFill',
|
||||||
label: i18nT('dataset:folder_dataset'),
|
label: i18nT('dataset:folder_dataset'),
|
||||||
@ -24,27 +64,13 @@ export const DatasetTypeMap = {
|
|||||||
[DatasetTypeEnum.websiteDataset]: {
|
[DatasetTypeEnum.websiteDataset]: {
|
||||||
icon: 'core/dataset/websiteDatasetOutline',
|
icon: 'core/dataset/websiteDatasetOutline',
|
||||||
label: i18nT('dataset:website_dataset'),
|
label: i18nT('dataset:website_dataset'),
|
||||||
collectionLabel: i18nT('common:Website')
|
collectionLabel: i18nT('common:Website'),
|
||||||
|
courseUrl: '/docs/guide/knowledge_base/websync/'
|
||||||
},
|
},
|
||||||
[DatasetTypeEnum.externalFile]: {
|
[DatasetTypeEnum.externalFile]: {
|
||||||
icon: 'core/dataset/externalDatasetOutline',
|
icon: 'core/dataset/externalDatasetOutline',
|
||||||
label: i18nT('dataset:external_file'),
|
label: i18nT('dataset:external_file'),
|
||||||
collectionLabel: i18nT('common:File')
|
collectionLabel: i18nT('common:File')
|
||||||
},
|
|
||||||
[DatasetTypeEnum.apiDataset]: {
|
|
||||||
icon: 'core/dataset/externalDatasetOutline',
|
|
||||||
label: i18nT('dataset:api_file'),
|
|
||||||
collectionLabel: i18nT('common:File')
|
|
||||||
},
|
|
||||||
[DatasetTypeEnum.feishu]: {
|
|
||||||
icon: 'core/dataset/feishuDatasetOutline',
|
|
||||||
label: i18nT('dataset:feishu_dataset'),
|
|
||||||
collectionLabel: i18nT('common:File')
|
|
||||||
},
|
|
||||||
[DatasetTypeEnum.yuque]: {
|
|
||||||
icon: 'core/dataset/yuqueDatasetOutline',
|
|
||||||
label: i18nT('dataset:yuque_dataset'),
|
|
||||||
collectionLabel: i18nT('common:File')
|
|
||||||
}
|
}
|
||||||
};
|
};
|
||||||
|
|
||||||
|
|||||||
packages/global/core/dataset/type.d.ts
@ -13,7 +13,12 @@ import type {
|
|||||||
ChunkTriggerConfigTypeEnum
|
ChunkTriggerConfigTypeEnum
|
||||||
} from './constants';
|
} from './constants';
|
||||||
import type { DatasetPermission } from '../../support/permission/dataset/controller';
|
import type { DatasetPermission } from '../../support/permission/dataset/controller';
|
||||||
import type { APIFileServer, FeishuServer, YuqueServer } from './apiDataset';
|
import type {
|
||||||
|
ApiDatasetServerType,
|
||||||
|
APIFileServer,
|
||||||
|
FeishuServer,
|
||||||
|
YuqueServer
|
||||||
|
} from './apiDataset/type';
|
||||||
import type { SourceMemberType } from 'support/user/type';
|
import type { SourceMemberType } from 'support/user/type';
|
||||||
import type { DatasetDataIndexTypeEnum } from './data/constants';
|
import type { DatasetDataIndexTypeEnum } from './data/constants';
|
||||||
import type { ParentIdType } from 'common/parentFolder/type';
|
import type { ParentIdType } from 'common/parentFolder/type';
|
||||||
@ -73,14 +78,16 @@ export type DatasetSchemaType = {
|
|||||||
chunkSettings?: ChunkSettingsType;
|
chunkSettings?: ChunkSettingsType;
|
||||||
|
|
||||||
inheritPermission: boolean;
|
inheritPermission: boolean;
|
||||||
apiServer?: APIFileServer;
|
|
||||||
feishuServer?: FeishuServer;
|
apiDatasetServer?: ApiDatasetServerType;
|
||||||
yuqueServer?: YuqueServer;
|
|
||||||
|
|
||||||
// abandon
|
// abandon
|
||||||
autoSync?: boolean;
|
autoSync?: boolean;
|
||||||
externalReadUrl?: string;
|
externalReadUrl?: string;
|
||||||
defaultPermission?: number;
|
defaultPermission?: number;
|
||||||
|
apiServer?: APIFileServer;
|
||||||
|
feishuServer?: FeishuServer;
|
||||||
|
yuqueServer?: YuqueServer;
|
||||||
};
|
};
|
||||||
|
|
||||||
export type DatasetCollectionSchemaType = ChunkSettingsType & {
|
export type DatasetCollectionSchemaType = ChunkSettingsType & {
|
||||||
|
|||||||
packages/service/common/api/type.d.ts
@ -1,5 +1,8 @@
|
|||||||
import type { ApiDatasetDetailResponse } from '@fastgpt/global/core/dataset/apiDataset';
|
import type {
|
||||||
import { FeishuServer, YuqueServer } from '@fastgpt/global/core/dataset/apiDataset';
|
ApiDatasetDetailResponse,
|
||||||
|
FeishuServer,
|
||||||
|
YuqueServer
|
||||||
|
} from '@fastgpt/global/core/dataset/apiDataset/type';
|
||||||
import type {
|
import type {
|
||||||
DeepRagSearchProps,
|
DeepRagSearchProps,
|
||||||
SearchDatasetDataResponse
|
SearchDatasetDataResponse
|
||||||
|
|||||||
@ -3,12 +3,11 @@ import type {
|
|||||||
ApiFileReadContentResponse,
|
ApiFileReadContentResponse,
|
||||||
APIFileReadResponse,
|
APIFileReadResponse,
|
||||||
ApiDatasetDetailResponse,
|
ApiDatasetDetailResponse,
|
||||||
APIFileServer,
|
APIFileServer
|
||||||
APIFileItem
|
} from '@fastgpt/global/core/dataset/apiDataset/type';
|
||||||
} from '@fastgpt/global/core/dataset/apiDataset';
|
|
||||||
import axios, { type Method } from 'axios';
|
import axios, { type Method } from 'axios';
|
||||||
import { addLog } from '../../../common/system/log';
|
import { addLog } from '../../../../common/system/log';
|
||||||
import { readFileRawTextByUrl } from '../read';
|
import { readFileRawTextByUrl } from '../../read';
|
||||||
import { type ParentIdType } from '@fastgpt/global/common/parentFolder/type';
|
import { type ParentIdType } from '@fastgpt/global/common/parentFolder/type';
|
||||||
import { type RequireOnlyOne } from '@fastgpt/global/common/type/utils';
|
import { type RequireOnlyOne } from '@fastgpt/global/common/type/utils';
|
||||||
|
|
||||||
@ -3,10 +3,10 @@ import type {
|
|||||||
ApiFileReadContentResponse,
|
ApiFileReadContentResponse,
|
||||||
ApiDatasetDetailResponse,
|
ApiDatasetDetailResponse,
|
||||||
FeishuServer
|
FeishuServer
|
||||||
} from '@fastgpt/global/core/dataset/apiDataset';
|
} from '@fastgpt/global/core/dataset/apiDataset/type';
|
||||||
import { type ParentIdType } from '@fastgpt/global/common/parentFolder/type';
|
import { type ParentIdType } from '@fastgpt/global/common/parentFolder/type';
|
||||||
import axios, { type Method } from 'axios';
|
import axios, { type Method } from 'axios';
|
||||||
import { addLog } from '../../../common/system/log';
|
import { addLog } from '../../../../common/system/log';
|
||||||
|
|
||||||
type ResponseDataType = {
|
type ResponseDataType = {
|
||||||
success: boolean;
|
success: boolean;
|
||||||
@ -1,18 +1,10 @@
|
|||||||
import type {
|
import { useApiDatasetRequest } from './custom/api';
|
||||||
APIFileServer,
|
import { useYuqueDatasetRequest } from './yuqueDataset/api';
|
||||||
YuqueServer,
|
import { useFeishuDatasetRequest } from './feishuDataset/api';
|
||||||
FeishuServer
|
import type { ApiDatasetServerType } from '@fastgpt/global/core/dataset/apiDataset/type';
|
||||||
} from '@fastgpt/global/core/dataset/apiDataset';
|
|
||||||
import { useApiDatasetRequest } from './api';
|
|
||||||
import { useYuqueDatasetRequest } from '../yuqueDataset/api';
|
|
||||||
import { useFeishuDatasetRequest } from '../feishuDataset/api';
|
|
||||||
|
|
||||||
export const getApiDatasetRequest = async (data: {
|
export const getApiDatasetRequest = async (apiDatasetServer?: ApiDatasetServerType) => {
|
||||||
apiServer?: APIFileServer;
|
const { apiServer, yuqueServer, feishuServer } = apiDatasetServer || {};
|
||||||
yuqueServer?: YuqueServer;
|
|
||||||
feishuServer?: FeishuServer;
|
|
||||||
}) => {
|
|
||||||
const { apiServer, yuqueServer, feishuServer } = data;
|
|
||||||
|
|
||||||
if (apiServer) {
|
if (apiServer) {
|
||||||
return useApiDatasetRequest({ apiServer });
|
return useApiDatasetRequest({ apiServer });
|
||||||
|
|||||||
@ -3,9 +3,9 @@ import type {
|
|||||||
ApiFileReadContentResponse,
|
ApiFileReadContentResponse,
|
||||||
YuqueServer,
|
YuqueServer,
|
||||||
ApiDatasetDetailResponse
|
ApiDatasetDetailResponse
|
||||||
} from '@fastgpt/global/core/dataset/apiDataset';
|
} from '@fastgpt/global/core/dataset/apiDataset/type';
|
||||||
import axios, { type Method } from 'axios';
|
import axios, { type Method } from 'axios';
|
||||||
import { addLog } from '../../../common/system/log';
|
import { addLog } from '../../../../common/system/log';
|
||||||
import { type ParentIdType } from '@fastgpt/global/common/parentFolder/type';
|
import { type ParentIdType } from '@fastgpt/global/common/parentFolder/type';
|
||||||
|
|
||||||
type ResponseDataType = {
|
type ResponseDataType = {
|
||||||
@ -105,7 +105,6 @@ export const useYuqueDatasetRequest = ({ yuqueServer }: { yuqueServer: YuqueServ
|
|||||||
if (!parentId) {
|
if (!parentId) {
|
||||||
if (yuqueServer.basePath) parentId = yuqueServer.basePath;
|
if (yuqueServer.basePath) parentId = yuqueServer.basePath;
|
||||||
}
|
}
|
||||||
|
|
||||||
let files: APIFileItem[] = [];
|
let files: APIFileItem[] = [];
|
||||||
|
|
||||||
if (!parentId) {
|
if (!parentId) {
|
||||||
@ -157,9 +157,7 @@ export const syncCollection = async (collection: CollectionWithDatasetType) => {
|
|||||||
return {
|
return {
|
||||||
type: DatasetSourceReadTypeEnum.apiFile,
|
type: DatasetSourceReadTypeEnum.apiFile,
|
||||||
sourceId,
|
sourceId,
|
||||||
apiServer: dataset.apiServer,
|
apiDatasetServer: dataset.apiDatasetServer
|
||||||
feishuServer: dataset.feishuServer,
|
|
||||||
yuqueServer: dataset.yuqueServer
|
|
||||||
};
|
};
|
||||||
})();
|
})();
|
||||||
|
|
||||||
|
|||||||
@ -9,13 +9,9 @@ import { type TextSplitProps, splitText2Chunks } from '@fastgpt/global/common/st
|
|||||||
import axios from 'axios';
|
import axios from 'axios';
|
||||||
import { readRawContentByFileBuffer } from '../../common/file/read/utils';
|
import { readRawContentByFileBuffer } from '../../common/file/read/utils';
|
||||||
import { parseFileExtensionFromUrl } from '@fastgpt/global/common/string/tools';
|
import { parseFileExtensionFromUrl } from '@fastgpt/global/common/string/tools';
|
||||||
import {
|
|
||||||
type APIFileServer,
|
|
||||||
type FeishuServer,
|
|
||||||
type YuqueServer
|
|
||||||
} from '@fastgpt/global/core/dataset/apiDataset';
|
|
||||||
import { getApiDatasetRequest } from './apiDataset';
|
import { getApiDatasetRequest } from './apiDataset';
|
||||||
import Papa from 'papaparse';
|
import Papa from 'papaparse';
|
||||||
|
import type { ApiDatasetServerType } from '@fastgpt/global/core/dataset/apiDataset/type';
|
||||||
|
|
||||||
export const readFileRawTextByUrl = async ({
|
export const readFileRawTextByUrl = async ({
|
||||||
teamId,
|
teamId,
|
||||||
@ -69,9 +65,7 @@ export const readDatasetSourceRawText = async ({
|
|||||||
sourceId,
|
sourceId,
|
||||||
selector,
|
selector,
|
||||||
externalFileId,
|
externalFileId,
|
||||||
apiServer,
|
apiDatasetServer,
|
||||||
feishuServer,
|
|
||||||
yuqueServer,
|
|
||||||
customPdfParse,
|
customPdfParse,
|
||||||
getFormatText
|
getFormatText
|
||||||
}: {
|
}: {
|
||||||
@ -84,9 +78,7 @@ export const readDatasetSourceRawText = async ({
|
|||||||
|
|
||||||
selector?: string; // link selector
|
selector?: string; // link selector
|
||||||
externalFileId?: string; // external file dataset
|
externalFileId?: string; // external file dataset
|
||||||
apiServer?: APIFileServer; // api dataset
|
apiDatasetServer?: ApiDatasetServerType; // api dataset
|
||||||
feishuServer?: FeishuServer; // feishu dataset
|
|
||||||
yuqueServer?: YuqueServer; // yuque dataset
|
|
||||||
}): Promise<{
|
}): Promise<{
|
||||||
title?: string;
|
title?: string;
|
||||||
rawText: string;
|
rawText: string;
|
||||||
@ -128,9 +120,7 @@ export const readDatasetSourceRawText = async ({
|
|||||||
};
|
};
|
||||||
} else if (type === DatasetSourceReadTypeEnum.apiFile) {
|
} else if (type === DatasetSourceReadTypeEnum.apiFile) {
|
||||||
const { title, rawText } = await readApiServerFileContent({
|
const { title, rawText } = await readApiServerFileContent({
|
||||||
apiServer,
|
apiDatasetServer,
|
||||||
feishuServer,
|
|
||||||
yuqueServer,
|
|
||||||
apiFileId: sourceId,
|
apiFileId: sourceId,
|
||||||
teamId,
|
teamId,
|
||||||
tmbId
|
tmbId
|
||||||
@ -147,17 +137,13 @@ export const readDatasetSourceRawText = async ({
|
|||||||
};
|
};
|
||||||
|
|
||||||
export const readApiServerFileContent = async ({
|
export const readApiServerFileContent = async ({
|
||||||
apiServer,
|
apiDatasetServer,
|
||||||
feishuServer,
|
|
||||||
yuqueServer,
|
|
||||||
apiFileId,
|
apiFileId,
|
||||||
teamId,
|
teamId,
|
||||||
tmbId,
|
tmbId,
|
||||||
customPdfParse
|
customPdfParse
|
||||||
}: {
|
}: {
|
||||||
apiServer?: APIFileServer;
|
apiDatasetServer?: ApiDatasetServerType;
|
||||||
feishuServer?: FeishuServer;
|
|
||||||
yuqueServer?: YuqueServer;
|
|
||||||
apiFileId: string;
|
apiFileId: string;
|
||||||
teamId: string;
|
teamId: string;
|
||||||
tmbId: string;
|
tmbId: string;
|
||||||
@ -166,13 +152,7 @@ export const readApiServerFileContent = async ({
|
|||||||
title?: string;
|
title?: string;
|
||||||
rawText: string;
|
rawText: string;
|
||||||
}> => {
|
}> => {
|
||||||
return (
|
return (await getApiDatasetRequest(apiDatasetServer)).getFileContent({
|
||||||
await getApiDatasetRequest({
|
|
||||||
apiServer,
|
|
||||||
yuqueServer,
|
|
||||||
feishuServer
|
|
||||||
})
|
|
||||||
).getFileContent({
|
|
||||||
teamId,
|
teamId,
|
||||||
tmbId,
|
tmbId,
|
||||||
apiFileId,
|
apiFileId,
|
||||||
|
|||||||
@ -127,14 +127,16 @@ const DatasetSchema = new Schema({
|
|||||||
type: Boolean,
|
type: Boolean,
|
||||||
default: true
|
default: true
|
||||||
},
|
},
|
||||||
apiServer: Object,
|
|
||||||
feishuServer: Object,
|
apiDatasetServer: Object,
|
||||||
yuqueServer: Object,
|
|
||||||
|
|
||||||
// abandoned
|
// abandoned
|
||||||
autoSync: Boolean,
|
autoSync: Boolean,
|
||||||
externalReadUrl: String,
|
externalReadUrl: String,
|
||||||
defaultPermission: Number
|
defaultPermission: Number,
|
||||||
|
apiServer: Object,
|
||||||
|
feishuServer: Object,
|
||||||
|
yuqueServer: Object
|
||||||
});
|
});
|
||||||
|
|
||||||
try {
|
try {
|
||||||
|
|||||||
@ -6,7 +6,7 @@ import type {
|
|||||||
APIFileServer,
|
APIFileServer,
|
||||||
FeishuServer,
|
FeishuServer,
|
||||||
YuqueServer
|
YuqueServer
|
||||||
} from '@fastgpt/global/core/dataset/apiDataset';
|
} from '@fastgpt/global/core/dataset/apiDataset/type';
|
||||||
import type {
|
import type {
|
||||||
DatasetSearchModeEnum,
|
DatasetSearchModeEnum,
|
||||||
DatasetTypeEnum
|
DatasetTypeEnum
|
||||||
@ -17,6 +17,7 @@ import {
|
|||||||
TrainingModeEnum
|
TrainingModeEnum
|
||||||
} from '@fastgpt/global/core/dataset/constants';
|
} from '@fastgpt/global/core/dataset/constants';
|
||||||
import type { SearchDataResponseItemType } from '@fastgpt/global/core/dataset/type';
|
import type { SearchDataResponseItemType } from '@fastgpt/global/core/dataset/type';
|
||||||
|
import type { ApiDatasetServerType } from '@fastgpt/global/core/dataset/apiDataset/type';
|
||||||
import { DatasetDataIndexItemType } from '@fastgpt/global/core/dataset/type';
|
import { DatasetDataIndexItemType } from '@fastgpt/global/core/dataset/type';
|
||||||
import type { NodeInputKeyEnum } from '@fastgpt/global/core/workflow/constants';
|
import type { NodeInputKeyEnum } from '@fastgpt/global/core/workflow/constants';
|
||||||
import { PermissionValueType } from '@fastgpt/global/support/permission/type';
|
import { PermissionValueType } from '@fastgpt/global/support/permission/type';
|
||||||
@ -31,9 +32,7 @@ export type CreateDatasetParams = {
|
|||||||
vectorModel?: string;
|
vectorModel?: string;
|
||||||
agentModel?: string;
|
agentModel?: string;
|
||||||
vlmModel?: string;
|
vlmModel?: string;
|
||||||
apiServer?: APIFileServer;
|
apiDatasetServer?: ApiDatasetServerType;
|
||||||
feishuServer?: FeishuServer;
|
|
||||||
yuqueServer?: YuqueServer;
|
|
||||||
};
|
};
|
||||||
|
|
||||||
export type RebuildEmbeddingProps = {
|
export type RebuildEmbeddingProps = {
|
||||||
|
|||||||
@ -3,11 +3,6 @@ import { DatasetTypeEnum } from '@fastgpt/global/core/dataset/constants';
|
|||||||
import { Flex, Input, Button, ModalBody, ModalFooter, Box } from '@chakra-ui/react';
|
import { Flex, Input, Button, ModalBody, ModalFooter, Box } from '@chakra-ui/react';
|
||||||
import type { UseFormReturn } from 'react-hook-form';
|
import type { UseFormReturn } from 'react-hook-form';
|
||||||
import { useTranslation } from 'next-i18next';
|
import { useTranslation } from 'next-i18next';
|
||||||
import type {
|
|
||||||
APIFileServer,
|
|
||||||
FeishuServer,
|
|
||||||
YuqueServer
|
|
||||||
} from '@fastgpt/global/core/dataset/apiDataset';
|
|
||||||
import { getApiDatasetPaths, getApiDatasetCatalog } from '@/web/core/dataset/api';
|
import { getApiDatasetPaths, getApiDatasetCatalog } from '@/web/core/dataset/api';
|
||||||
import type {
|
import type {
|
||||||
GetResourceFolderListItemResponse,
|
GetResourceFolderListItemResponse,
|
||||||
@ -22,6 +17,7 @@ import FormLabel from '@fastgpt/web/components/common/MyBox/FormLabel';
|
|||||||
import MyModal from '@fastgpt/web/components/common/MyModal';
|
import MyModal from '@fastgpt/web/components/common/MyModal';
|
||||||
import MyIcon from '@fastgpt/web/components/common/Icon';
|
import MyIcon from '@fastgpt/web/components/common/Icon';
|
||||||
import { FolderIcon } from '@fastgpt/global/common/file/image/constants';
|
import { FolderIcon } from '@fastgpt/global/common/file/image/constants';
|
||||||
|
import type { ApiDatasetServerType } from '@fastgpt/global/core/dataset/apiDataset/type';
|
||||||
|
|
||||||
const ApiDatasetForm = ({
|
const ApiDatasetForm = ({
|
||||||
type,
|
type,
|
||||||
@ -32,9 +28,7 @@ const ApiDatasetForm = ({
|
|||||||
datasetId?: string;
|
datasetId?: string;
|
||||||
form: UseFormReturn<
|
form: UseFormReturn<
|
||||||
{
|
{
|
||||||
apiServer?: APIFileServer;
|
apiDatasetServer?: ApiDatasetServerType;
|
||||||
feishuServer?: FeishuServer;
|
|
||||||
yuqueServer?: YuqueServer;
|
|
||||||
},
|
},
|
||||||
any
|
any
|
||||||
>;
|
>;
|
||||||
@ -42,9 +36,10 @@ const ApiDatasetForm = ({
|
|||||||
const { t } = useTranslation();
|
const { t } = useTranslation();
|
||||||
const { register, setValue, watch } = form;
|
const { register, setValue, watch } = form;
|
||||||
|
|
||||||
const yuqueServer = watch('yuqueServer');
|
const apiDatasetServer = watch('apiDatasetServer');
|
||||||
const feishuServer = watch('feishuServer');
|
const yuqueServer = apiDatasetServer?.yuqueServer;
|
||||||
const apiServer = watch('apiServer');
|
const feishuServer = apiDatasetServer?.feishuServer;
|
||||||
|
const apiServer = apiDatasetServer?.apiServer;
|
||||||
|
|
||||||
const [pathNames, setPathNames] = useState(t('dataset:rootdirectory'));
|
const [pathNames, setPathNames] = useState(t('dataset:rootdirectory'));
|
||||||
const [
|
const [
|
||||||
@ -91,9 +86,7 @@ const ApiDatasetForm = ({
|
|||||||
const path = await getApiDatasetPaths({
|
const path = await getApiDatasetPaths({
|
||||||
datasetId,
|
datasetId,
|
||||||
parentId,
|
parentId,
|
||||||
yuqueServer,
|
apiDatasetServer
|
||||||
feishuServer,
|
|
||||||
apiServer
|
|
||||||
});
|
});
|
||||||
setPathNames(path);
|
setPathNames(path);
|
||||||
},
|
},
|
||||||
@ -108,13 +101,13 @@ const ApiDatasetForm = ({
|
|||||||
const value = id === 'root' || id === null || id === undefined ? '' : id;
|
const value = id === 'root' || id === null || id === undefined ? '' : id;
|
||||||
switch (type) {
|
switch (type) {
|
||||||
case DatasetTypeEnum.yuque:
|
case DatasetTypeEnum.yuque:
|
||||||
setValue('yuqueServer.basePath', value);
|
setValue('apiDatasetServer.yuqueServer.basePath', value);
|
||||||
break;
|
break;
|
||||||
case DatasetTypeEnum.feishu:
|
case DatasetTypeEnum.feishu:
|
||||||
setValue('feishuServer.folderToken', value);
|
setValue('apiDatasetServer.feishuServer.folderToken', value);
|
||||||
break;
|
break;
|
||||||
case DatasetTypeEnum.apiDataset:
|
case DatasetTypeEnum.apiDataset:
|
||||||
setValue('apiServer.basePath', value);
|
setValue('apiDatasetServer.apiServer.basePath', value);
|
||||||
break;
|
break;
|
||||||
}
|
}
|
||||||
|
|
||||||
@ -147,32 +140,10 @@ const ApiDatasetForm = ({
|
|||||||
<BaseUrlSelector
|
<BaseUrlSelector
|
||||||
selectId={yuqueServer?.basePath || apiServer?.basePath || 'root'}
|
selectId={yuqueServer?.basePath || apiServer?.basePath || 'root'}
|
||||||
server={async (e: GetResourceFolderListProps) => {
|
server={async (e: GetResourceFolderListProps) => {
|
||||||
const params: GetApiDatasetCataLogProps = { parentId: e.parentId };
|
const params: GetApiDatasetCataLogProps = {
|
||||||
|
parentId: e.parentId,
|
||||||
switch (type) {
|
apiDatasetServer
|
||||||
case DatasetTypeEnum.yuque:
|
|
||||||
params.yuqueServer = {
|
|
||||||
userId: yuqueServer?.userId || '',
|
|
||||||
token: yuqueServer?.token || '',
|
|
||||||
basePath: ''
|
|
||||||
};
|
};
|
||||||
break;
|
|
||||||
// Currently, only Yuque is using it
|
|
||||||
case DatasetTypeEnum.feishu:
|
|
||||||
params.feishuServer = {
|
|
||||||
appId: feishuServer?.appId || '',
|
|
||||||
appSecret: feishuServer?.appSecret || '',
|
|
||||||
folderToken: feishuServer?.folderToken || ''
|
|
||||||
};
|
|
||||||
break;
|
|
||||||
case DatasetTypeEnum.apiDataset:
|
|
||||||
params.apiServer = {
|
|
||||||
baseUrl: apiServer?.baseUrl || '',
|
|
||||||
authorization: apiServer?.authorization || '',
|
|
||||||
basePath: ''
|
|
||||||
};
|
|
||||||
break;
|
|
||||||
}
|
|
||||||
|
|
||||||
return getApiDatasetCatalog(params);
|
return getApiDatasetCatalog(params);
|
||||||
}}
|
}}
|
||||||
@ -193,7 +164,7 @@ const ApiDatasetForm = ({
|
|||||||
bg={'myWhite.600'}
|
bg={'myWhite.600'}
|
||||||
placeholder={t('dataset:api_url')}
|
placeholder={t('dataset:api_url')}
|
||||||
maxLength={200}
|
maxLength={200}
|
||||||
{...register('apiServer.baseUrl', { required: true })}
|
{...register('apiDatasetServer.apiServer.baseUrl', { required: true })}
|
||||||
/>
|
/>
|
||||||
</Flex>
|
</Flex>
|
||||||
<Flex mt={6} alignItems={'center'}>
|
<Flex mt={6} alignItems={'center'}>
|
||||||
@ -204,7 +175,7 @@ const ApiDatasetForm = ({
|
|||||||
bg={'myWhite.600'}
|
bg={'myWhite.600'}
|
||||||
placeholder={t('dataset:request_headers')}
|
placeholder={t('dataset:request_headers')}
|
||||||
maxLength={2000}
|
maxLength={2000}
|
||||||
{...register('apiServer.authorization')}
|
{...register('apiDatasetServer.apiServer.authorization')}
|
||||||
/>
|
/>
|
||||||
</Flex>
|
</Flex>
|
||||||
{renderBaseUrlSelector()}
|
{renderBaseUrlSelector()}
|
||||||
@ -227,7 +198,7 @@ const ApiDatasetForm = ({
|
|||||||
bg={'myWhite.600'}
|
bg={'myWhite.600'}
|
||||||
placeholder={'App ID'}
|
placeholder={'App ID'}
|
||||||
maxLength={200}
|
maxLength={200}
|
||||||
{...register('feishuServer.appId', { required: true })}
|
{...register('apiDatasetServer.feishuServer.appId', { required: true })}
|
||||||
/>
|
/>
|
||||||
</Flex>
|
</Flex>
|
||||||
<Flex mt={6}>
|
<Flex mt={6}>
|
||||||
@ -244,7 +215,7 @@ const ApiDatasetForm = ({
|
|||||||
bg={'myWhite.600'}
|
bg={'myWhite.600'}
|
||||||
placeholder={'App Secret'}
|
placeholder={'App Secret'}
|
||||||
maxLength={200}
|
maxLength={200}
|
||||||
{...register('feishuServer.appSecret', { required: true })}
|
{...register('apiDatasetServer.feishuServer.appSecret', { required: true })}
|
||||||
/>
|
/>
|
||||||
</Flex>
|
</Flex>
|
||||||
<Flex mt={6}>
|
<Flex mt={6}>
|
||||||
@ -261,7 +232,7 @@ const ApiDatasetForm = ({
|
|||||||
bg={'myWhite.600'}
|
bg={'myWhite.600'}
|
||||||
placeholder={'Folder Token'}
|
placeholder={'Folder Token'}
|
||||||
maxLength={200}
|
maxLength={200}
|
||||||
{...register('feishuServer.folderToken', { required: true })}
|
{...register('apiDatasetServer.feishuServer.folderToken', { required: true })}
|
||||||
/>
|
/>
|
||||||
</Flex>
|
</Flex>
|
||||||
{/* {renderBaseUrlSelector()}
|
{/* {renderBaseUrlSelector()}
|
||||||
@ -278,7 +249,7 @@ const ApiDatasetForm = ({
|
|||||||
bg={'myWhite.600'}
|
bg={'myWhite.600'}
|
||||||
placeholder={'User ID'}
|
placeholder={'User ID'}
|
||||||
maxLength={200}
|
maxLength={200}
|
||||||
{...register('yuqueServer.userId', { required: true })}
|
{...register('apiDatasetServer.yuqueServer.userId', { required: true })}
|
||||||
/>
|
/>
|
||||||
</Flex>
|
</Flex>
|
||||||
<Flex mt={6} alignItems={'center'}>
|
<Flex mt={6} alignItems={'center'}>
|
||||||
@ -289,7 +260,7 @@ const ApiDatasetForm = ({
|
|||||||
bg={'myWhite.600'}
|
bg={'myWhite.600'}
|
||||||
placeholder={'Token'}
|
placeholder={'Token'}
|
||||||
maxLength={200}
|
maxLength={200}
|
||||||
{...register('yuqueServer.token', { required: true })}
|
{...register('apiDatasetServer.yuqueServer.token', { required: true })}
|
||||||
/>
|
/>
|
||||||
</Flex>
|
</Flex>
|
||||||
{renderBaseUrlSelector()}
|
{renderBaseUrlSelector()}
|
||||||
|
|||||||
@ -17,7 +17,8 @@ import {
|
|||||||
DatasetCollectionTypeEnum,
|
DatasetCollectionTypeEnum,
|
||||||
DatasetTypeEnum,
|
DatasetTypeEnum,
|
||||||
DatasetTypeMap,
|
DatasetTypeMap,
|
||||||
DatasetStatusEnum
|
DatasetStatusEnum,
|
||||||
|
ApiDatasetTypeMap
|
||||||
} from '@fastgpt/global/core/dataset/constants';
|
} from '@fastgpt/global/core/dataset/constants';
|
||||||
import EditFolderModal, { useEditFolder } from '../../EditFolderModal';
|
import EditFolderModal, { useEditFolder } from '../../EditFolderModal';
|
||||||
import { TabEnum } from '../../../../pages/dataset/detail/index';
|
import { TabEnum } from '../../../../pages/dataset/detail/index';
|
||||||
@ -435,9 +436,7 @@ const Header = ({ hasTrainingData }: { hasTrainingData: boolean }) => {
|
|||||||
/>
|
/>
|
||||||
)}
|
)}
|
||||||
{/* apiDataset */}
|
{/* apiDataset */}
|
||||||
{(datasetDetail?.type === DatasetTypeEnum.apiDataset ||
|
{datasetDetail?.type && ApiDatasetTypeMap[datasetDetail.type] && (
|
||||||
datasetDetail?.type === DatasetTypeEnum.feishu ||
|
|
||||||
datasetDetail?.type === DatasetTypeEnum.yuque) && (
|
|
||||||
<Flex
|
<Flex
|
||||||
px={3.5}
|
px={3.5}
|
||||||
py={2}
|
py={2}
|
||||||
|
|||||||
@ -13,7 +13,7 @@ import { type ParentTreePathItemType } from '@fastgpt/global/common/parentFolder
|
|||||||
import FolderPath from '@/components/common/folder/Path';
|
import FolderPath from '@/components/common/folder/Path';
|
||||||
import { getSourceNameIcon } from '@fastgpt/global/core/dataset/utils';
|
import { getSourceNameIcon } from '@fastgpt/global/core/dataset/utils';
|
||||||
import MyBox from '@fastgpt/web/components/common/MyBox';
|
import MyBox from '@fastgpt/web/components/common/MyBox';
|
||||||
import { type APIFileItem } from '@fastgpt/global/core/dataset/apiDataset';
|
import { type APIFileItem } from '@fastgpt/global/core/dataset/apiDataset/type';
|
||||||
import SearchInput from '@fastgpt/web/components/common/Input/SearchInput';
|
import SearchInput from '@fastgpt/web/components/common/Input/SearchInput';
|
||||||
import { useMount } from 'ahooks';
|
import { useMount } from 'ahooks';
|
||||||
|
|
||||||
|
|||||||
@ -5,23 +5,17 @@ import { useTranslation } from 'next-i18next';
|
|||||||
import { useRequest2 } from '@fastgpt/web/hooks/useRequest';
|
import { useRequest2 } from '@fastgpt/web/hooks/useRequest';
|
||||||
import { useForm } from 'react-hook-form';
|
import { useForm } from 'react-hook-form';
|
||||||
import { useToast } from '@fastgpt/web/hooks/useToast';
|
import { useToast } from '@fastgpt/web/hooks/useToast';
|
||||||
import {
|
|
||||||
type APIFileServer,
|
|
||||||
type FeishuServer,
|
|
||||||
type YuqueServer
|
|
||||||
} from '@fastgpt/global/core/dataset/apiDataset';
|
|
||||||
import ApiDatasetForm from '@/pageComponents/dataset/ApiDatasetForm';
|
import ApiDatasetForm from '@/pageComponents/dataset/ApiDatasetForm';
|
||||||
import { useContextSelector } from 'use-context-selector';
|
import { useContextSelector } from 'use-context-selector';
|
||||||
import { DatasetPageContext } from '@/web/core/dataset/context/datasetPageContext';
|
import { DatasetPageContext } from '@/web/core/dataset/context/datasetPageContext';
|
||||||
import { datasetTypeCourseMap } from '@/web/core/dataset/constants';
|
|
||||||
import { getDocPath } from '@/web/common/system/doc';
|
import { getDocPath } from '@/web/common/system/doc';
|
||||||
import MyIcon from '@fastgpt/web/components/common/Icon';
|
import MyIcon from '@fastgpt/web/components/common/Icon';
|
||||||
|
import type { ApiDatasetServerType } from '@fastgpt/global/core/dataset/apiDataset/type';
|
||||||
|
import { DatasetTypeMap } from '@fastgpt/global/core/dataset/constants';
|
||||||
|
|
||||||
export type EditAPIDatasetInfoFormType = {
|
export type EditAPIDatasetInfoFormType = {
|
||||||
id: string;
|
id: string;
|
||||||
apiServer?: APIFileServer;
|
apiDatasetServer?: ApiDatasetServerType;
|
||||||
yuqueServer?: YuqueServer;
|
|
||||||
feishuServer?: FeishuServer;
|
|
||||||
};
|
};
|
||||||
|
|
||||||
const EditAPIDatasetInfoModal = ({
|
const EditAPIDatasetInfoModal = ({
|
||||||
@ -60,7 +54,7 @@ const EditAPIDatasetInfoModal = ({
|
|||||||
return (
|
return (
|
||||||
<MyModal isOpen onClose={onClose} w={'450px'} iconSrc="modal/edit" title={title}>
|
<MyModal isOpen onClose={onClose} w={'450px'} iconSrc="modal/edit" title={title}>
|
||||||
<ModalBody>
|
<ModalBody>
|
||||||
{datasetTypeCourseMap[type] && (
|
{DatasetTypeMap[type]?.courseUrl && (
|
||||||
<Flex alignItems={'center'} justifyContent={'space-between'}>
|
<Flex alignItems={'center'} justifyContent={'space-between'}>
|
||||||
<Box color={'myGray.900'} fontSize={'sm'} fontWeight={500}>
|
<Box color={'myGray.900'} fontSize={'sm'} fontWeight={500}>
|
||||||
{t('dataset:apidataset_configuration')}
|
{t('dataset:apidataset_configuration')}
|
||||||
@ -71,7 +65,7 @@ const EditAPIDatasetInfoModal = ({
|
|||||||
color={'primary.600'}
|
color={'primary.600'}
|
||||||
fontSize={'sm'}
|
fontSize={'sm'}
|
||||||
cursor={'pointer'}
|
cursor={'pointer'}
|
||||||
onClick={() => window.open(getDocPath(datasetTypeCourseMap[type]), '_blank')}
|
onClick={() => window.open(getDocPath(DatasetTypeMap[type].courseUrl!), '_blank')}
|
||||||
>
|
>
|
||||||
<MyIcon name={'book'} w={4} mr={0.5} />
|
<MyIcon name={'book'} w={4} mr={0.5} />
|
||||||
{t('common:Instructions')}
|
{t('common:Instructions')}
|
||||||
|
|||||||
@ -311,12 +311,12 @@ const Info = ({ datasetId }: { datasetId: string }) => {
|
|||||||
onClick={() =>
|
onClick={() =>
|
||||||
setEditedAPIDataset({
|
setEditedAPIDataset({
|
||||||
id: datasetDetail._id,
|
id: datasetDetail._id,
|
||||||
apiServer: datasetDetail.apiServer
|
apiDatasetServer: datasetDetail.apiDatasetServer
|
||||||
})
|
})
|
||||||
}
|
}
|
||||||
/>
|
/>
|
||||||
</Flex>
|
</Flex>
|
||||||
<Box fontSize={'mini'}>{datasetDetail.apiServer?.baseUrl}</Box>
|
<Box fontSize={'mini'}>{datasetDetail.apiDatasetServer?.apiServer?.baseUrl}</Box>
|
||||||
</Box>
|
</Box>
|
||||||
</>
|
</>
|
||||||
)}
|
)}
|
||||||
@ -336,12 +336,12 @@ const Info = ({ datasetId }: { datasetId: string }) => {
|
|||||||
onClick={() =>
|
onClick={() =>
|
||||||
setEditedAPIDataset({
|
setEditedAPIDataset({
|
||||||
id: datasetDetail._id,
|
id: datasetDetail._id,
|
||||||
yuqueServer: datasetDetail.yuqueServer
|
apiDatasetServer: datasetDetail.apiDatasetServer
|
||||||
})
|
})
|
||||||
}
|
}
|
||||||
/>
|
/>
|
||||||
</Flex>
|
</Flex>
|
||||||
<Box fontSize={'mini'}>{datasetDetail.yuqueServer?.userId}</Box>
|
<Box fontSize={'mini'}>{datasetDetail.apiDatasetServer?.yuqueServer?.userId}</Box>
|
||||||
</Box>
|
</Box>
|
||||||
</>
|
</>
|
||||||
)}
|
)}
|
||||||
@ -361,12 +361,14 @@ const Info = ({ datasetId }: { datasetId: string }) => {
|
|||||||
onClick={() =>
|
onClick={() =>
|
||||||
setEditedAPIDataset({
|
setEditedAPIDataset({
|
||||||
id: datasetDetail._id,
|
id: datasetDetail._id,
|
||||||
feishuServer: datasetDetail.feishuServer
|
apiDatasetServer: datasetDetail.apiDatasetServer
|
||||||
})
|
})
|
||||||
}
|
}
|
||||||
/>
|
/>
|
||||||
</Flex>
|
</Flex>
|
||||||
<Box fontSize={'mini'}>{datasetDetail.feishuServer?.folderToken}</Box>
|
<Box fontSize={'mini'}>
|
||||||
|
{datasetDetail.apiDatasetServer?.feishuServer?.folderToken}
|
||||||
|
</Box>
|
||||||
</Box>
|
</Box>
|
||||||
</>
|
</>
|
||||||
)}
|
)}
|
||||||
@ -435,9 +437,7 @@ const Info = ({ datasetId }: { datasetId: string }) => {
|
|||||||
onEdit={(data) =>
|
onEdit={(data) =>
|
||||||
updateDataset({
|
updateDataset({
|
||||||
id: datasetId,
|
id: datasetId,
|
||||||
apiServer: data.apiServer,
|
apiDatasetServer: data.apiDatasetServer
|
||||||
yuqueServer: data.yuqueServer,
|
|
||||||
feishuServer: data.feishuServer
|
|
||||||
})
|
})
|
||||||
}
|
}
|
||||||
/>
|
/>
|
||||||
|
|||||||
@ -11,14 +11,13 @@ import MyModal from '@fastgpt/web/components/common/MyModal';
|
|||||||
import { postCreateDataset } from '@/web/core/dataset/api';
|
import { postCreateDataset } from '@/web/core/dataset/api';
|
||||||
import type { CreateDatasetParams } from '@/global/core/dataset/api.d';
|
import type { CreateDatasetParams } from '@/global/core/dataset/api.d';
|
||||||
import { useTranslation } from 'next-i18next';
|
import { useTranslation } from 'next-i18next';
|
||||||
import { DatasetTypeEnum } from '@fastgpt/global/core/dataset/constants';
|
import { DatasetTypeEnum, DatasetTypeMap } from '@fastgpt/global/core/dataset/constants';
|
||||||
import AIModelSelector from '@/components/Select/AIModelSelector';
|
import AIModelSelector from '@/components/Select/AIModelSelector';
|
||||||
import { useSystem } from '@fastgpt/web/hooks/useSystem';
|
import { useSystem } from '@fastgpt/web/hooks/useSystem';
|
||||||
import QuestionTip from '@fastgpt/web/components/common/MyTooltip/QuestionTip';
|
import QuestionTip from '@fastgpt/web/components/common/MyTooltip/QuestionTip';
|
||||||
import ComplianceTip from '@/components/common/ComplianceTip/index';
|
import ComplianceTip from '@/components/common/ComplianceTip/index';
|
||||||
import MyIcon from '@fastgpt/web/components/common/Icon';
|
import MyIcon from '@fastgpt/web/components/common/Icon';
|
||||||
import { getDocPath } from '@/web/common/system/doc';
|
import { getDocPath } from '@/web/common/system/doc';
|
||||||
import { datasetTypeCourseMap } from '@/web/core/dataset/constants';
|
|
||||||
import ApiDatasetForm from '../ApiDatasetForm';
|
import ApiDatasetForm from '../ApiDatasetForm';
|
||||||
import { getWebDefaultEmbeddingModel, getWebDefaultLLMModel } from '@/web/common/system/utils';
|
import { getWebDefaultEmbeddingModel, getWebDefaultLLMModel } from '@/web/common/system/utils';
|
||||||
|
|
||||||
@ -43,31 +42,6 @@ const CreateModal = ({
|
|||||||
const { defaultModels, embeddingModelList, datasetModelList, getVlmModelList } = useSystemStore();
|
const { defaultModels, embeddingModelList, datasetModelList, getVlmModelList } = useSystemStore();
|
||||||
const { isPc } = useSystem();
|
const { isPc } = useSystem();
|
||||||
|
|
||||||
const datasetTypeMap = useMemo(() => {
|
|
||||||
return {
|
|
||||||
[DatasetTypeEnum.dataset]: {
|
|
||||||
name: t('dataset:common_dataset'),
|
|
||||||
icon: 'core/dataset/commonDatasetColor'
|
|
||||||
},
|
|
||||||
[DatasetTypeEnum.websiteDataset]: {
|
|
||||||
name: t('dataset:website_dataset'),
|
|
||||||
icon: 'core/dataset/websiteDatasetColor'
|
|
||||||
},
|
|
||||||
[DatasetTypeEnum.apiDataset]: {
|
|
||||||
name: t('dataset:api_file'),
|
|
||||||
icon: 'core/dataset/externalDatasetColor'
|
|
||||||
},
|
|
||||||
[DatasetTypeEnum.feishu]: {
|
|
||||||
name: t('dataset:feishu_dataset'),
|
|
||||||
icon: 'core/dataset/feishuDatasetColor'
|
|
||||||
},
|
|
||||||
[DatasetTypeEnum.yuque]: {
|
|
||||||
name: t('dataset:yuque_dataset'),
|
|
||||||
icon: 'core/dataset/yuqueDatasetColor'
|
|
||||||
}
|
|
||||||
};
|
|
||||||
}, [t]);
|
|
||||||
|
|
||||||
const filterNotHiddenVectorModelList = embeddingModelList.filter((item) => !item.hidden);
|
const filterNotHiddenVectorModelList = embeddingModelList.filter((item) => !item.hidden);
|
||||||
|
|
||||||
const vllmModelList = useMemo(() => getVlmModelList(), [getVlmModelList]);
|
const vllmModelList = useMemo(() => getVlmModelList(), [getVlmModelList]);
|
||||||
@ -76,7 +50,7 @@ const CreateModal = ({
|
|||||||
defaultValues: {
|
defaultValues: {
|
||||||
parentId,
|
parentId,
|
||||||
type: type || DatasetTypeEnum.dataset,
|
type: type || DatasetTypeEnum.dataset,
|
||||||
avatar: datasetTypeMap[type].icon,
|
avatar: DatasetTypeMap[type].icon,
|
||||||
name: '',
|
name: '',
|
||||||
intro: '',
|
intro: '',
|
||||||
vectorModel:
|
vectorModel:
|
||||||
@ -121,10 +95,10 @@ const CreateModal = ({
|
|||||||
w={'20px'}
|
w={'20px'}
|
||||||
h={'20px'}
|
h={'20px'}
|
||||||
borderRadius={'xs'}
|
borderRadius={'xs'}
|
||||||
src={datasetTypeMap[type].icon}
|
src={DatasetTypeMap[type].icon}
|
||||||
pr={'10px'}
|
pr={'10px'}
|
||||||
/>
|
/>
|
||||||
{t('common:core.dataset.Create dataset', { name: datasetTypeMap[type].name })}
|
{t('common:core.dataset.Create dataset', { name: t(DatasetTypeMap[type].label) })}
|
||||||
</Flex>
|
</Flex>
|
||||||
}
|
}
|
||||||
isOpen
|
isOpen
|
||||||
@ -138,14 +112,14 @@ const CreateModal = ({
|
|||||||
<Box color={'myGray.900'} fontWeight={500} fontSize={'sm'}>
|
<Box color={'myGray.900'} fontWeight={500} fontSize={'sm'}>
|
||||||
{t('common:input_name')}
|
{t('common:input_name')}
|
||||||
</Box>
|
</Box>
|
||||||
{datasetTypeCourseMap[type] && (
|
{DatasetTypeMap[type]?.courseUrl && (
|
||||||
<Flex
|
<Flex
|
||||||
as={'span'}
|
as={'span'}
|
||||||
alignItems={'center'}
|
alignItems={'center'}
|
||||||
color={'primary.600'}
|
color={'primary.600'}
|
||||||
fontSize={'sm'}
|
fontSize={'sm'}
|
||||||
cursor={'pointer'}
|
cursor={'pointer'}
|
||||||
onClick={() => window.open(getDocPath(datasetTypeCourseMap[type]), '_blank')}
|
onClick={() => window.open(getDocPath(DatasetTypeMap[type].courseUrl!), '_blank')}
|
||||||
>
|
>
|
||||||
<MyIcon name={'book'} w={4} mr={0.5} />
|
<MyIcon name={'book'} w={4} mr={0.5} />
|
||||||
{t('common:Instructions')}
|
{t('common:Instructions')}
|
||||||
|
|||||||
@ -1,41 +1,14 @@
|
|||||||
import { Box, Flex, type FlexProps } from '@chakra-ui/react';
|
import { Box, Flex, type FlexProps } from '@chakra-ui/react';
|
||||||
import { DatasetTypeEnum } from '@fastgpt/global/core/dataset/constants';
|
import { DatasetTypeEnum, DatasetTypeMap } from '@fastgpt/global/core/dataset/constants';
|
||||||
import MyIcon from '@fastgpt/web/components/common/Icon';
|
import MyIcon from '@fastgpt/web/components/common/Icon';
|
||||||
import React, { useMemo } from 'react';
|
import React from 'react';
|
||||||
import { useTranslation } from 'next-i18next';
|
import { useTranslation } from 'next-i18next';
|
||||||
|
|
||||||
const SideTag = ({ type, ...props }: { type: `${DatasetTypeEnum}` } & FlexProps) => {
|
const SideTag = ({ type, ...props }: { type: `${DatasetTypeEnum}` } & FlexProps) => {
|
||||||
if (type === DatasetTypeEnum.folder) return null;
|
if (type === DatasetTypeEnum.folder) return null;
|
||||||
const { t } = useTranslation();
|
const { t } = useTranslation();
|
||||||
const DatasetListTypeMap = useMemo(() => {
|
|
||||||
return {
|
const item = DatasetTypeMap[type] || DatasetTypeMap['dataset'];
|
||||||
[DatasetTypeEnum.dataset]: {
|
|
||||||
icon: 'core/dataset/commonDatasetOutline',
|
|
||||||
label: t('dataset:common_dataset')
|
|
||||||
},
|
|
||||||
[DatasetTypeEnum.websiteDataset]: {
|
|
||||||
icon: 'core/dataset/websiteDatasetOutline',
|
|
||||||
label: t('dataset:website_dataset')
|
|
||||||
},
|
|
||||||
[DatasetTypeEnum.externalFile]: {
|
|
||||||
icon: 'core/dataset/externalDatasetOutline',
|
|
||||||
label: t('dataset:external_file')
|
|
||||||
},
|
|
||||||
[DatasetTypeEnum.apiDataset]: {
|
|
||||||
icon: 'core/dataset/externalDatasetOutline',
|
|
||||||
label: t('dataset:api_file')
|
|
||||||
},
|
|
||||||
[DatasetTypeEnum.feishu]: {
|
|
||||||
icon: 'core/dataset/feishuDatasetOutline',
|
|
||||||
label: t('dataset:feishu_dataset')
|
|
||||||
},
|
|
||||||
[DatasetTypeEnum.yuque]: {
|
|
||||||
icon: 'core/dataset/yuqueDatasetOutline',
|
|
||||||
label: t('dataset:yuque_dataset')
|
|
||||||
}
|
|
||||||
};
|
|
||||||
}, [t]);
|
|
||||||
const item = DatasetListTypeMap[type] || DatasetListTypeMap['dataset'];
|
|
||||||
|
|
||||||
return (
|
return (
|
||||||
<Flex
|
<Flex
|
||||||
@ -50,7 +23,7 @@ const SideTag = ({ type, ...props }: { type: `${DatasetTypeEnum}` } & FlexProps)
|
|||||||
>
|
>
|
||||||
<MyIcon name={item.icon as any} w={'0.8rem'} color={'myGray.400'} />
|
<MyIcon name={item.icon as any} w={'0.8rem'} color={'myGray.400'} />
|
||||||
<Box fontSize={'mini'} ml={1}>
|
<Box fontSize={'mini'} ml={1}>
|
||||||
{item.label}
|
{t(item.label)}
|
||||||
</Box>
|
</Box>
|
||||||
</Flex>
|
</Flex>
|
||||||
);
|
);
|
||||||
|
|||||||
projects/app/src/pages/api/admin/initv4911.ts (new file)
@ -0,0 +1,38 @@
|
|||||||
|
import { NextAPI } from '@/service/middleware/entry';
|
||||||
|
import { authCert } from '@fastgpt/service/support/permission/auth/common';
|
||||||
|
import { type NextApiRequest, type NextApiResponse } from 'next';
|
||||||
|
|
||||||
|
import { MongoDataset } from '@fastgpt/service/core/dataset/schema';
|
||||||
|
import { DatasetTypeEnum } from '@fastgpt/global/core/dataset/constants';
|
||||||
|
|
||||||
|
async function handler(req: NextApiRequest, _res: NextApiResponse) {
|
||||||
|
await authCert({ req, authRoot: true });
|
||||||
|
|
||||||
|
console.log('更新所有 API 知识库');
|
||||||
|
|
||||||
|
const datasets = await MongoDataset.find({
|
||||||
|
type: {
|
||||||
|
$in: [DatasetTypeEnum.apiDataset, DatasetTypeEnum.feishu, DatasetTypeEnum.yuque]
|
||||||
|
}
|
||||||
|
}).lean();
|
||||||
|
|
||||||
|
for (const dataset of datasets) {
|
||||||
|
console.log(dataset._id);
|
||||||
|
await MongoDataset.updateOne(
|
||||||
|
{ _id: dataset._id },
|
||||||
|
{
|
||||||
|
$set: {
|
||||||
|
apiDatasetServer: {
|
||||||
|
...(dataset.apiServer && { apiServer: dataset.apiServer }),
|
||||||
|
...(dataset.feishuServer && { feishuServer: dataset.feishuServer }),
|
||||||
|
...(dataset.yuqueServer && { yuqueServer: dataset.yuqueServer })
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
);
|
||||||
|
}
|
||||||
|
|
||||||
|
return { success: true };
|
||||||
|
}
|
||||||
|
|
||||||
|
export default NextAPI(handler);
|
||||||
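For reference, a hedged sketch of the consolidated object this migration writes. The nested field names are inferred from the values it copies and from the fields the existing handlers read; the authoritative declaration is `ApiDatasetServerType` in `@fastgpt/global/core/dataset/apiDataset/type`:

```ts
// Illustration only: inferred shape of the consolidated apiDatasetServer field.
type ApiDatasetServerSketch = {
  apiServer?: { baseUrl: string; authorization: string; basePath?: string };
  feishuServer?: { appId: string; appSecret: string; folderToken: string };
  yuqueServer?: { userId: string; token: string; basePath?: string };
};
```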
```diff
@@ -1,35 +1,34 @@
 import { getApiDatasetRequest } from '@fastgpt/service/core/dataset/apiDataset';
 import { NextAPI } from '@/service/middleware/entry';
 import type { ParentIdType } from '@fastgpt/global/common/parentFolder/type';
-import type {
-  APIFileItem,
-  APIFileServer,
-  YuqueServer,
-  FeishuServer
-} from '@fastgpt/global/core/dataset/apiDataset';
 import { type NextApiRequest } from 'next';
 import { authCert } from '@fastgpt/service/support/permission/auth/common';
+import type {
+  ApiDatasetServerType,
+  APIFileItem
+} from '@fastgpt/global/core/dataset/apiDataset/type';

 export type GetApiDatasetCataLogProps = {
   parentId?: ParentIdType;
-  yuqueServer?: YuqueServer;
-  feishuServer?: FeishuServer;
-  apiServer?: APIFileServer;
+  apiDatasetServer?: ApiDatasetServerType;
 };

 export type GetApiDatasetCataLogResponse = APIFileItem[];

 async function handler(req: NextApiRequest) {
-  let { searchKey = '', parentId = null, yuqueServer, feishuServer, apiServer } = req.body;
+  let { searchKey = '', parentId = null, apiDatasetServer } = req.body;

   await authCert({ req, authToken: true });

+  // Remove basePath from apiDatasetServer
+  Object.values(apiDatasetServer).forEach((server: any) => {
+    if (server.basePath) {
+      delete server.basePath;
+    }
+  });
+
   const data = await (
-    await getApiDatasetRequest({
-      feishuServer,
-      yuqueServer,
-      apiServer
-    })
+    await getApiDatasetRequest(apiDatasetServer)
   ).listFiles({ parentId, searchKey });

   return data?.filter((item: APIFileItem) => item.hasChild === true) || [];
```
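A hypothetical request body for the updated catalog endpoint, showing how the single `apiDatasetServer` field replaces the separate `apiServer` / `feishuServer` / `yuqueServer` parameters (all values below are made up):

```ts
// Example GetApiDatasetCataLogProps payload after this change (illustrative values only).
const body = {
  parentId: null,
  searchKey: '',
  apiDatasetServer: {
    yuqueServer: { userId: 'u-123', token: 'xxxxxx', basePath: '' }
  }
};
```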
```diff
@@ -2,11 +2,9 @@ import { NextAPI } from '@/service/middleware/entry';
 import { DatasetErrEnum } from '@fastgpt/global/common/error/code/dataset';
 import type { ParentIdType } from '@fastgpt/global/common/parentFolder/type';
 import type {
-  APIFileServer,
-  YuqueServer,
-  FeishuServer,
-  ApiDatasetDetailResponse
-} from '@fastgpt/global/core/dataset/apiDataset';
+  ApiDatasetDetailResponse,
+  ApiDatasetServerType
+} from '@fastgpt/global/core/dataset/apiDataset/type';
 import { getApiDatasetRequest } from '@fastgpt/service/core/dataset/apiDataset';
 import type { ApiRequestProps, ApiResponseType } from '@fastgpt/service/type/next';
 import { authCert } from '@fastgpt/service/support/permission/auth/common';
@@ -18,9 +16,7 @@ export type GetApiDatasetPathQuery = {};
 export type GetApiDatasetPathBody = {
   datasetId?: string;
   parentId?: ParentIdType;
-  yuqueServer?: YuqueServer;
-  feishuServer?: FeishuServer;
-  apiServer?: APIFileServer;
+  apiDatasetServer?: ApiDatasetServerType;
 };

 export type GetApiDatasetPathResponse = string;
@@ -50,7 +46,7 @@ async function handler(
   const { datasetId, parentId } = req.body;
   if (!parentId) return '';

-  const { yuqueServer, feishuServer, apiServer } = await (async () => {
+  const apiDatasetServer = await (async () => {
     if (datasetId) {
       const { dataset } = await authDataset({
         req,
@@ -60,40 +56,15 @@ async function handler(
         datasetId
       });

-      return {
-        yuqueServer: req.body.yuqueServer
-          ? { ...req.body.yuqueServer, token: dataset.yuqueServer?.token ?? '' }
-          : dataset.yuqueServer,
-        feishuServer: req.body.feishuServer
-          ? { ...req.body.feishuServer, appSecret: dataset.feishuServer?.appSecret ?? '' }
-          : dataset.feishuServer,
-        apiServer: req.body.apiServer
-          ? {
-              ...req.body.apiServer,
-              authorization: dataset.apiServer?.authorization ?? ''
-            }
-          : dataset.apiServer
-      };
+      return dataset.apiDatasetServer;
     } else {
       await authCert({ req, authToken: true });

-      return {
-        yuqueServer: req.body.yuqueServer,
-        feishuServer: req.body.feishuServer,
-        apiServer: req.body.apiServer
-      };
+      return req.body.apiDatasetServer;
     }
   })();

-  if (feishuServer) {
-    return '';
-  }
-
-  if (yuqueServer || apiServer) {
-    const apiDataset = await getApiDatasetRequest({
-      yuqueServer,
-      apiServer
-    });
+  const apiDataset = await getApiDatasetRequest(apiDatasetServer);

   if (!apiDataset?.getFileDetail) {
     return Promise.reject(DatasetErrEnum.noApiServer);
@@ -102,7 +73,4 @@ async function handler(
     return await getFullPath(parentId, apiDataset.getFileDetail);
   }

-  return Promise.reject(new Error(DatasetErrEnum.noApiServer));
-}
-
 export default NextAPI(handler);
```
```diff
@@ -23,17 +23,7 @@ async function handler(req: NextApiRequest) {
     per: ReadPermissionVal
   });

-  const apiServer = dataset.apiServer;
-  const feishuServer = dataset.feishuServer;
-  const yuqueServer = dataset.yuqueServer;
-
-  return (
-    await getApiDatasetRequest({
-      apiServer,
-      yuqueServer,
-      feishuServer
-    })
-  ).listFiles({ searchKey, parentId });
+  return (await getApiDatasetRequest(dataset.apiDatasetServer)).listFiles({ searchKey, parentId });
 }

 export default NextAPI(handler);
```
```diff
@@ -23,10 +23,6 @@ async function handler(req: NextApiRequest): CreateCollectionResponse {
     per: WritePermissionVal
   });

-  const apiServer = dataset.apiServer;
-  const feishuServer = dataset.feishuServer;
-  const yuqueServer = dataset.yuqueServer;
-
   // Auth same apiFileId
   const storeCol = await MongoDatasetCollection.findOne(
     {
@@ -42,9 +38,7 @@ async function handler(req: NextApiRequest): CreateCollectionResponse {
   }

   const { title, rawText } = await readApiServerFileContent({
-    apiServer,
-    feishuServer,
-    yuqueServer,
+    apiDatasetServer: dataset.apiDatasetServer,
     apiFileId,
     teamId,
     tmbId,
```
```diff
@@ -62,9 +62,7 @@ async function handler(
     return {
       type: DatasetSourceReadTypeEnum.apiFile,
       sourceId: collection.apiFileId,
-      apiServer: collection.dataset.apiServer,
-      feishuServer: collection.dataset.feishuServer,
-      yuqueServer: collection.dataset.yuqueServer
+      apiDatasetServer: collection.dataset.apiDatasetServer
     };
   }
   if (collection.type === DatasetCollectionTypeEnum.externalFile) {
```
```diff
@@ -94,17 +94,7 @@ async function handler(
     return collection.rawLink;
   }
   if (collection.type === DatasetCollectionTypeEnum.apiFile && collection.apiFileId) {
-    const apiServer = collection.dataset.apiServer;
-    const feishuServer = collection.dataset.feishuServer;
-    const yuqueServer = collection.dataset.yuqueServer;
-
-    return (
-      await getApiDatasetRequest({
-        apiServer,
-        feishuServer,
-        yuqueServer
-      })
-    ).getFilePreviewUrl({
+    return (await getApiDatasetRequest(collection.dataset.apiDatasetServer)).getFilePreviewUrl({
       apiFileId: collection.apiFileId
     });
   }
```
```diff
@@ -38,9 +38,7 @@ async function handler(
     vectorModel = getDefaultEmbeddingModel()?.model,
     agentModel = getDatasetModel()?.model,
     vlmModel,
-    apiServer,
-    feishuServer,
-    yuqueServer
+    apiDatasetServer
   } = req.body;

   // auth
@@ -86,9 +84,7 @@ async function handler(
         vlmModel,
         avatar,
         type,
-        apiServer,
-        feishuServer,
-        yuqueServer
+        apiDatasetServer
       }
     ],
     { session, ordered: true }
```
```diff
@@ -7,6 +7,7 @@ import { type ApiRequestProps } from '@fastgpt/service/type/next';
 import { CommonErrEnum } from '@fastgpt/global/common/error/code/common';
 import { getWebsiteSyncDatasetStatus } from '@fastgpt/service/core/dataset/websiteSync';
 import { DatasetStatusEnum, DatasetTypeEnum } from '@fastgpt/global/core/dataset/constants';
+import { filterApiDatasetServerPublicData } from '@fastgpt/global/core/dataset/apiDataset/utils';

 type Query = {
   id: string;
@@ -40,36 +41,16 @@ async function handler(req: ApiRequestProps<Query>): Promise<DatasetItemType> {
       errorMsg: undefined
     };
   })();
+  console.log(filterApiDatasetServerPublicData(dataset.apiDatasetServer));
   return {
     ...dataset,
     status,
     errorMsg,
-    apiServer: dataset.apiServer
-      ? {
-          baseUrl: dataset.apiServer.baseUrl,
-          authorization: '',
-          basePath: dataset.apiServer.basePath
-        }
-      : undefined,
-    yuqueServer: dataset.yuqueServer
-      ? {
-          userId: dataset.yuqueServer.userId,
-          token: '',
-          basePath: dataset.yuqueServer.basePath
-        }
-      : undefined,
-    feishuServer: dataset.feishuServer
-      ? {
-          appId: dataset.feishuServer.appId,
-          appSecret: '',
-          folderToken: dataset.feishuServer.folderToken
-        }
-      : undefined,
     permission,
     vectorModel: getEmbeddingModel(dataset.vectorModel),
     agentModel: getLLMModel(dataset.agentModel),
-    vlmModel: getVlmModel(dataset.vlmModel)
+    vlmModel: getVlmModel(dataset.vlmModel),
+    apiDatasetServer: filterApiDatasetServerPublicData(dataset.apiDatasetServer)
   };
 }
```
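The per-field credential masking that used to live in this handler is now delegated to `filterApiDatasetServerPublicData`. A hedged sketch of what such a filter plausibly does, mirroring the blanking of `authorization`, `token` and `appSecret` in the code it replaces; this is not the actual implementation in `@fastgpt/global/core/dataset/apiDataset/utils`:

```ts
// Sketch only: reproduces the credential-blanking behaviour of the removed inline code.
type ServerSketch = {
  apiServer?: { baseUrl: string; authorization: string; basePath?: string };
  feishuServer?: { appId: string; appSecret: string; folderToken: string };
  yuqueServer?: { userId: string; token: string; basePath?: string };
};

const filterPublicDataSketch = (server?: ServerSketch) => {
  if (!server) return undefined;
  return {
    ...(server.apiServer && { apiServer: { ...server.apiServer, authorization: '' } }),
    ...(server.feishuServer && { feishuServer: { ...server.feishuServer, appSecret: '' } }),
    ...(server.yuqueServer && { yuqueServer: { ...server.yuqueServer, token: '' } })
  };
};
```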
```diff
@@ -121,11 +121,9 @@ async function handler(
     type,
     sourceId,
     selector,
-    apiServer: dataset.apiServer,
-    feishuServer: dataset.feishuServer,
-    yuqueServer: dataset.yuqueServer,
     externalFileId,
-    customPdfParse
+    customPdfParse,
+    apiDatasetServer: dataset.apiDatasetServer
   });

   const chunks = rawText2Chunks({
```
```diff
@@ -69,9 +69,7 @@ async function handler(
     vlmModel,
     websiteConfig,
     externalReadUrl,
-    apiServer,
-    yuqueServer,
-    feishuServer,
+    apiDatasetServer,
     autoSync,
     chunkSettings
   } = req.body;
@@ -168,6 +166,37 @@ async function handler(
     await delDatasetRelevantData({ datasets: [dataset], session });
   }

+  const apiDatasetParams = (() => {
+    if (!apiDatasetServer) return {};
+
+    const flattenObjectWithConditions = (
+      obj: any,
+      prefix = 'apiDatasetServer'
+    ): Record<string, any> => {
+      const result: Record<string, any> = {};
+
+      if (!obj || typeof obj !== 'object') return result;
+
+      Object.keys(obj).forEach((key) => {
+        const value = obj[key];
+        const newKey = prefix ? `${prefix}.${key}` : key;
+
+        if (value) {
+          if (typeof value === 'object' && !Array.isArray(value)) {
+            // Recursively flatten nested objects
+            Object.assign(result, flattenObjectWithConditions(value, newKey));
+          } else {
+            // Add non-empty primitive values
+            result[newKey] = value;
+          }
+        }
+      });
+
+      return result;
+    };
+    return flattenObjectWithConditions(apiDatasetServer);
+  })();
+
   await MongoDataset.findByIdAndUpdate(
     id,
     {
@@ -180,23 +209,9 @@ async function handler(
       ...(chunkSettings && { chunkSettings }),
       ...(intro !== undefined && { intro }),
       ...(externalReadUrl !== undefined && { externalReadUrl }),
-      ...(!!apiServer?.baseUrl && { 'apiServer.baseUrl': apiServer.baseUrl }),
-      ...(!!apiServer?.authorization && {
-        'apiServer.authorization': apiServer.authorization
-      }),
-      ...(!!apiServer?.basePath !== undefined && { 'apiServer.basePath': apiServer?.basePath }),
-      ...(!!yuqueServer?.userId && { 'yuqueServer.userId': yuqueServer.userId }),
-      ...(!!yuqueServer?.token && { 'yuqueServer.token': yuqueServer.token }),
-      ...(!!yuqueServer?.basePath !== undefined && {
-        'yuqueServer.basePath': yuqueServer?.basePath
-      }),
-      ...(!!feishuServer?.appId && { 'feishuServer.appId': feishuServer.appId }),
-      ...(!!feishuServer?.appSecret && { 'feishuServer.appSecret': feishuServer.appSecret }),
-      ...(!!feishuServer?.folderToken && {
-        'feishuServer.folderToken': feishuServer.folderToken
-      }),
       ...(isMove && { inheritPermission: true }),
-      ...(typeof autoSync === 'boolean' && { autoSync })
+      ...(typeof autoSync === 'boolean' && { autoSync }),
+      ...apiDatasetParams
     },
     { session }
   );
```
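For illustration (values are hypothetical), the IIFE above turns a partial `apiDatasetServer` payload into dot-notation keys under the `apiDatasetServer` prefix, so the Mongo `$set` only touches the fields the client actually sent:

```ts
// Hypothetical input/output pair for flattenObjectWithConditions as defined above.
const partialUpdate = {
  yuqueServer: { userId: 'u-123', basePath: '/kb' } // token omitted by the client
};
// flattenObjectWithConditions(partialUpdate) would produce:
// {
//   'apiDatasetServer.yuqueServer.userId': 'u-123',
//   'apiDatasetServer.yuqueServer.basePath': '/kb'
// }
```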
```diff
@@ -69,7 +69,7 @@ import type {
   getTrainingErrorBody,
   getTrainingErrorResponse
 } from '@/pages/api/core/dataset/training/getTrainingError';
-import type { APIFileItem } from '@fastgpt/global/core/dataset/apiDataset';
+import type { APIFileItem } from '@fastgpt/global/core/dataset/apiDataset/type';
 import type { GetQuoteDataProps } from '@/pages/api/core/dataset/data/getQuoteData';
 import type {
   GetApiDatasetCataLogResponse,
```
```diff
@@ -2,8 +2,7 @@ import { defaultQAModels, defaultVectorModels } from '@fastgpt/global/core/ai/mo
 import {
   DatasetCollectionDataProcessModeEnum,
   DatasetCollectionTypeEnum,
-  DatasetTypeEnum,
-  TrainingModeEnum
+  DatasetTypeEnum
 } from '@fastgpt/global/core/dataset/constants';
 import type {
   DatasetCollectionItemType,
@@ -66,16 +65,6 @@ export const defaultCollectionDetail: DatasetCollectionItemType = {
   indexAmount: 0
 };

-export const datasetTypeCourseMap: Record<`${DatasetTypeEnum}`, string> = {
-  [DatasetTypeEnum.folder]: '',
-  [DatasetTypeEnum.dataset]: '',
-  [DatasetTypeEnum.apiDataset]: '/docs/guide/knowledge_base/api_dataset/',
-  [DatasetTypeEnum.websiteDataset]: '/docs/guide/knowledge_base/websync/',
-  [DatasetTypeEnum.feishu]: '/docs/guide/knowledge_base/lark_dataset/',
-  [DatasetTypeEnum.yuque]: '/docs/guide/knowledge_base/yuque_dataset/',
-  [DatasetTypeEnum.externalFile]: ''
-};
-
 export const TrainingProcess = {
   waiting: {
     label: i18nT('dataset:process.Waiting'),
```
|
|||||||
import { type ParentTreePathItemType } from '@fastgpt/global/common/parentFolder/type';
|
import { type ParentTreePathItemType } from '@fastgpt/global/common/parentFolder/type';
|
||||||
import { useRequest2 } from '@fastgpt/web/hooks/useRequest';
|
import { useRequest2 } from '@fastgpt/web/hooks/useRequest';
|
||||||
import { getWebLLMModel } from '@/web/common/system/utils';
|
import { getWebLLMModel } from '@/web/common/system/utils';
|
||||||
|
import { filterApiDatasetServerPublicData } from '@fastgpt/global/core/dataset/apiDataset/utils';
|
||||||
|
|
||||||
type DatasetPageContextType = {
|
type DatasetPageContextType = {
|
||||||
datasetId: string;
|
datasetId: string;
|
||||||
@ -103,27 +104,7 @@ export const DatasetPageContextProvider = ({
|
|||||||
...data,
|
...data,
|
||||||
agentModel: data.agentModel ? getWebLLMModel(data.agentModel) : state.agentModel,
|
agentModel: data.agentModel ? getWebLLMModel(data.agentModel) : state.agentModel,
|
||||||
vlmModel: data.vlmModel ? getWebLLMModel(data.vlmModel) : state.vlmModel,
|
vlmModel: data.vlmModel ? getWebLLMModel(data.vlmModel) : state.vlmModel,
|
||||||
apiServer: data.apiServer
|
apiDatasetServer: filterApiDatasetServerPublicData(data.apiDatasetServer)
|
||||||
? {
|
|
||||||
baseUrl: data.apiServer.baseUrl,
|
|
||||||
authorization: '',
|
|
||||||
basePath: data.apiServer.basePath
|
|
||||||
}
|
|
||||||
: state.apiServer,
|
|
||||||
yuqueServer: data.yuqueServer
|
|
||||||
? {
|
|
||||||
userId: data.yuqueServer.userId,
|
|
||||||
token: '',
|
|
||||||
basePath: data.yuqueServer.basePath
|
|
||||||
}
|
|
||||||
: state.yuqueServer,
|
|
||||||
feishuServer: data.feishuServer
|
|
||||||
? {
|
|
||||||
appId: data.feishuServer.appId,
|
|
||||||
appSecret: '',
|
|
||||||
folderToken: data.feishuServer.folderToken
|
|
||||||
}
|
|
||||||
: state.feishuServer
|
|
||||||
}));
|
}));
|
||||||
}
|
}
|
||||||
};
|
};
|
||||||
|
|||||||
`projects/app/src/web/core/dataset/type.d.ts`:

```diff
@@ -2,7 +2,7 @@ import type { PushDatasetDataChunkProps } from '@fastgpt/global/core/dataset/api
 import type { TrainingModeEnum } from '@fastgpt/global/core/dataset/constants';
 import type { ChunkSettingModeEnum } from '@fastgpt/global/core/dataset/constants';
 import type { UseFormReturn } from 'react-hook-form';
-import type { APIFileItem } from '@fastgpt/global/core/dataset/apiDataset';
+import type { APIFileItem } from '@fastgpt/global/core/dataset/apiDataset/type';

 export type ImportSourceItemType = {
   id: string;
```