perf: api dataset code
(Image diff metadata: 23 documentation screenshots were recompressed, roughly halving each file size, e.g. 97 KiB → 42 KiB.)
@@ -1,5 +1,5 @@
---
title: 'V4.9.1'
title: 'V4.9.1 (includes upgrade script)'
description: 'FastGPT V4.9.1 release notes'
icon: 'upgrade'
draft: false

@@ -7,6 +7,21 @@ toc: true
weight: 789
---

## Run the upgrade script

Only commercial-edition users need to run this script.

From any terminal, send one HTTP request, replacing {{rootkey}} with the `rootkey` environment variable and {{host}} with your **FastGPT domain**:

```bash
curl --location --request POST 'https://{{host}}/api/admin/initv4911' \
--header 'rootkey: {{rootkey}}' \
--header 'Content-Type: application/json'
```

**What the script does**

1. Migrates the third-party knowledge base API configuration.

## 🚀 New features
@@ -1,5 +1,5 @@
---
title: 'V4.9.4'
title: 'V4.9.4 (includes upgrade script)'
description: 'FastGPT V4.9.4 release notes'
icon: 'upgrade'
draft: false

@@ -7,142 +7,55 @@ toc: true
weight: 410
---
There are many document platforms on the internet, such as Feishu and Yuque, and different FastGPT users rely on different ones. With limited development capacity, FastGPT currently only supports Feishu, Yuque, API, and web-site knowledge bases. To meet demand for other platforms and strengthen the open-source ecosystem, this guide explains how to develop a third-party knowledge base yourself.
There are many document platforms on the internet, such as Feishu and Yuque, and different FastGPT users rely on different ones. FastGPT currently has built-in support for Feishu and Yuque; to integrate another platform, refer to this section.

## Prepare a local development environment

To develop FastGPT you first need a local development environment; see [Quick start for local development](../../development/intro.md).
## A unified interface specification

## Start developing
To integrate different document platforms in a uniform way, FastGPT defines a standard interface for third-party document libraries, consisting of 4 endpoints; see the [API file library interface](/docs/guide/knowledge_base/api_datase).

All built-in document libraries are extensions of the standard API file library. You can refer to the code in `FastGPT/packages/service/core/dataset/apiDataset/yuqueDataset/api.ts` to extend support to other platforms. Four interfaces need to be implemented:

1. List files
2. Get file content / file link
3. Get the preview URL of the original document
4. Get file details
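The four interfaces above can be sketched as one TypeScript contract. This is a hedged sketch: the field names mirror the `APIFileItem` and response types referenced later in this guide, but the exact shapes in your FastGPT version may differ.

```typescript
// Hypothetical sketch of the contract a third-party dataset hook returns.
// Verify field names against your FastGPT version before relying on them.
type APIFileItem = {
  id: string;
  parentId: string | null;
  name: string;
  type: 'file' | 'folder';
  hasChild: boolean;
  updateTime: Date;
  createTime: Date;
};

type ApiDatasetRequest = {
  // 1. List files
  listFiles: (params: { parentId?: string | null }) => Promise<APIFileItem[]>;
  // 2. Get file content / file link
  getFileContent: (params: {
    teamId: string;
    tmbId: string;
    apiFileId: string;
  }) => Promise<{ title?: string; rawText: string }>;
  // 3. Get the preview URL of the original document
  getFilePreviewUrl: (params: { apiFileId: string }) => Promise<string>;
  // 4. Get file details
  getFileDetail: (params: { apiFileId: string }) => Promise<APIFileItem>;
};

// A stub implementation, only to show the shape compiles end to end.
const demoRequest: ApiDatasetRequest = {
  listFiles: async () => [],
  getFileContent: async () => ({ title: 'demo', rawText: 'hello' }),
  getFilePreviewUrl: async ({ apiFileId }) => `https://example.com/preview/${apiFileId}`,
  getFileDetail: async ({ apiFileId }) => ({
    id: apiFileId,
    parentId: null,
    name: 'demo',
    type: 'file',
    hasChild: false,
    updateTime: new Date(),
    createTime: new Date()
  })
};
```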
## Start a third-party file library

For ease of explanation, this guide adds a Feishu knowledge base as the example.

First, open `FastGPT\packages\global\core\dataset\apiDataset.d.ts` in the FastGPT project and add a Server type for your knowledge base.
### 1. Add third-party document library parameters

{{% alert icon="🤖 " context="success" %}}
Design the knowledge base type's fields around what your subsequent API calls will need.
If the knowledge base supports selecting a root directory, add a `basePath` field. [See the root-directory feature](/docs/guide/knowledge_base/third_dataset/#添加配置表单)
{{% /alert %}}
First, open `FastGPT\packages\global\core\dataset\apiDataset.d.ts` in the FastGPT project and add a Server type for the third-party document library. For example, Yuque requires two fields, `userId` and `token`, as credentials.


Then, under `FastGPT\packages\service\core\dataset\apiDataset\` in the FastGPT project, create a folder for the new library — here `feishuKownledgeDataset` — and create an `api.ts` inside it, as shown:



## Contents of `api.ts`

First, complete the imports, for example:

```TS
import type {
  APIFileItem,
  ApiFileReadContentResponse,
  ApiDatasetDetailResponse,
  FeishuKnowledgeServer // the knowledge base Server type added earlier
} from '@fastgpt/global/core/dataset/apiDataset';
import { type ParentIdType } from '@fastgpt/global/common/parentFolder/type';
import axios, { type Method } from 'axios';
import { addLog } from '../../../common/system/log';
```
Then define the response types, based on the return types of the APIs you will call. For example:
```TS
type ResponseDataType = {
  success: boolean;
  message: string;
  data: any;
};

/**
 * Request
 */
type FeishuFileListResponse = {
  items: {
    title: string;
    creator: string;
    has_child: boolean;
    parent_node_token: string;
    owner_id: string;
    space_id: string;
    node_token: string;
    node_type: string;
    node_create_time: number;
    obj_edit_time: number;
    obj_create_time: number;
    obj_token: string;
    obj_type: string;
    origin_node_token: string;
    origin_space_id: string;
  }[];
  has_more: boolean;
  next_page_token: string;
};
```
```ts
export type YuqueServer = {
  userId: string;
  token?: string;
  basePath?: string;
};
```
First design a function, named after the knowledge base type plus `Request`, for example:

```TS
export const useFeishuKnowledgeDatasetRequest = ({
  feishuKnowledgeServer
}: {
  feishuKnowledgeServer: FeishuKnowledgeServer;
}) => {}
```
Once the function is defined, implement the API methods. The following four are required:

{{% alert icon="🤖 " context="success" %}}
For concrete implementations, refer to the `api.ts` of any knowledge base under `projects\app\src\service\core\dataset\`; knowledge base folders end with `dataset`.
If the document library supports selecting a root directory, add a `basePath` field.
{{% /alert %}}

| Method | Returns | Description |
| --- | --- | --- |
| listFiles | id, parentId, name, type, hasChild, updateTime, createTime | Lists the knowledge base's files |
| getFileContent | title, rawText | Gets a knowledge base file's content |
| getFileDetail | name, parentId, id | Gets a knowledge base file's details |
| getFilePreviewUrl | a URL | Gets the file's original page |
### 2. Create the Hook file

After designing `api.ts`, add the function you wrote to `projects\app\src\service\core\dataset\apidataset\index.ts`, for example:
Each third-party document library maintains its API surface through a Hook; the Hook must implement 4 functions.


- Under `FastGPT\packages\service\core\dataset\apiDataset\`, create a folder for the document library, then create an `api.ts` file inside it
- In `api.ts`, define 4 functions:
  - `listFiles`: list files
  - `getFileContent`: get file content / file link
  - `getFileDetail`: get file details
  - `getFilePreviewUrl`: get the preview URL of the original document
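A minimal skeleton of such a Hook, sketched under assumptions: the `DemoServer` config type, the endpoint paths, and the injected `request` helper are hypothetical stand-ins (FastGPT's real hooks build an axios instance instead). It only illustrates how the 4 functions close over the server config.

```typescript
type DemoServer = { appId: string; folderToken: string }; // hypothetical config type

type DemoFileListItem = { token: string; title: string; has_child: boolean };

// The hook closes over the server config and returns the 4 required functions.
// `request` is injected so the sketch stays self-contained and testable.
export const useDemoDatasetRequest = ({
  demoServer,
  request
}: {
  demoServer: DemoServer;
  request: (url: string, params: Record<string, unknown>) => Promise<any>;
}) => {
  const listFiles = async ({ parentId }: { parentId?: string | null }) => {
    // Fall back to the configured root folder when no parent is given.
    const data = await request('/files', { parent: parentId ?? demoServer.folderToken });
    return (data.items as DemoFileListItem[]).map((item) => ({
      id: item.token,
      parentId: parentId ?? null,
      name: item.title,
      type: item.has_child ? ('folder' as const) : ('file' as const),
      hasChild: item.has_child
    }));
  };

  const getFileContent = async ({ apiFileId }: { apiFileId: string }) => {
    const data = await request(`/files/${apiFileId}/content`, {});
    return { title: data.title as string, rawText: data.content as string };
  };

  const getFileDetail = async ({ apiFileId }: { apiFileId: string }) => {
    const data = await request(`/files/${apiFileId}`, {});
    return { id: apiFileId, parentId: (data.parent as string) ?? null, name: data.title as string };
  };

  const getFilePreviewUrl = async ({ apiFileId }: { apiFileId: string }) =>
    `https://example.com/docs/${apiFileId}`; // placeholder preview URL

  return { listFiles, getFileContent, getFileDetail, getFilePreviewUrl };
};
```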
After completing these, some supporting methods are needed. In `index.ts`, find the references to the `getApiDatasetRequest` function, as shown:
### 3. Add configuration fields to the database



{{% alert icon="🤖 " context="warning" %}}
The `getCatalog.ts` and `getPathNames.ts` files support the root-path setting; if your knowledge base does not support a root path, they can return empty values. [See the root-directory feature](/docs/guide/knowledge_base/third_dataset/#添加配置表单) As shown:



{{% /alert %}}

Several files reference this function; they are the knowledge base's methods. Enter each of these files and add your knowledge base type. Taking `list.ts` as an example, add it as shown:



{{% alert icon="🤖 " context="success" %}}
For the concrete additions, refer to the other knowledge bases in the same file.
{{% /alert %}}

In `FastGPT\projects\app\src\pages\api\core\dataset\detail.ts`, add the following.



In `FastGPT\projects\app\src\pages\api\core\dataset\update.ts`, add the following.

{{% alert icon="🤖 " context="warning" %}}
This file is responsible for updating the knowledge base configuration; without this addition, configuration updates will fail.
{{% /alert %}}


## Add the database type

To add a new knowledge base, add your knowledge base type in `packages/service/core/dataset/schema.ts`, as shown:
- In `packages/service/core/dataset/schema.ts`, add the third-party document library's configuration field; its type is uniformly set to `Object`.
- In `FastGPT/packages/global/core/dataset/type.d.ts`, add the data type of that configuration field, set to the parameter type created in step 1.


@@ -150,10 +63,9 @@ export const useFeishuKnowledgeDatasetRequest = ({

After modifying `schema.ts`, restart the FastGPT project for the change to take effect.
{{% /alert %}}

### 4. Add the knowledge base type

## Add the knowledge base type

After completing the steps above, add your own knowledge base type in `projects/app/src/web/core/dataset/constants.ts`.
In `projects/app/src/web/core/dataset/constants.ts`, add your own knowledge base type.
```TS
export const datasetTypeCourseMap: Record<`${DatasetTypeEnum}`, string> = {
```

**packages/global/core/dataset/api.d.ts** (6 changes, vendored)
@@ -17,6 +17,9 @@ import type { ParentIdType } from '../../common/parentFolder/type';
/* ================= dataset ===================== */
export type DatasetUpdateBody = {
  id: string;

  apiDatasetServer?: DatasetSchemaType['apiDatasetServer'];

  parentId?: ParentIdType;
  name?: string;
  avatar?: string;
@@ -28,9 +31,6 @@ export type DatasetUpdateBody = {
  websiteConfig?: DatasetSchemaType['websiteConfig'];
  externalReadUrl?: DatasetSchemaType['externalReadUrl'];
  defaultPermission?: DatasetSchemaType['defaultPermission'];
  apiServer?: DatasetSchemaType['apiServer'];
  yuqueServer?: DatasetSchemaType['yuqueServer'];
  feishuServer?: DatasetSchemaType['feishuServer'];
  chunkSettings?: DatasetSchemaType['chunkSettings'];

  // sync schedule
@@ -1,5 +1,5 @@
import { RequireOnlyOne } from '../../common/type/utils';
import type { ParentIdType } from '../../common/parentFolder/type.d';
import { RequireOnlyOne } from '../../../common/type/utils';
import type { ParentIdType } from '../../../common/parentFolder/type';

export type APIFileItem = {
  id: string;
@@ -28,6 +28,12 @@ export type YuqueServer = {
  basePath?: string;
};

export type ApiDatasetServerType = {
  apiServer?: APIFileServer;
  feishuServer?: FeishuServer;
  yuqueServer?: YuqueServer;
};

// Api dataset api

export type APIFileListResponse = APIFileItem[];

**packages/global/core/dataset/apiDataset/utils.ts** (new file, 31 lines)
@@ -0,0 +1,31 @@
import type { ApiDatasetServerType } from './type';

export const filterApiDatasetServerPublicData = (apiDatasetServer?: ApiDatasetServerType) => {
  if (!apiDatasetServer) return undefined;

  const { apiServer, yuqueServer, feishuServer } = apiDatasetServer;

  return {
    apiServer: apiServer
      ? {
          baseUrl: apiServer.baseUrl,
          authorization: '',
          basePath: apiServer.basePath
        }
      : undefined,
    yuqueServer: yuqueServer
      ? {
          userId: yuqueServer.userId,
          token: '',
          basePath: yuqueServer.basePath
        }
      : undefined,
    feishuServer: feishuServer
      ? {
          appId: feishuServer.appId,
          appSecret: '',
          folderToken: feishuServer.folderToken
        }
      : undefined
  };
};
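This new utility blanks the credential fields (`authorization`, `token`, `appSecret`) before the configuration is returned to the client. A self-contained usage sketch — the function body is copied from the hunk above, while the types are simplified stand-ins for the real ones in `apiDataset/type`:

```typescript
// Simplified stand-in types; the real ones live in packages/global/core/dataset/apiDataset/type.
type ApiDatasetServerType = {
  apiServer?: { baseUrl: string; authorization: string; basePath?: string };
  yuqueServer?: { userId: string; token?: string; basePath?: string };
  feishuServer?: { appId: string; appSecret: string; folderToken: string };
};

// Copied from the new utils.ts: returns the config with secrets blanked out.
const filterApiDatasetServerPublicData = (apiDatasetServer?: ApiDatasetServerType) => {
  if (!apiDatasetServer) return undefined;
  const { apiServer, yuqueServer, feishuServer } = apiDatasetServer;
  return {
    apiServer: apiServer
      ? { baseUrl: apiServer.baseUrl, authorization: '', basePath: apiServer.basePath }
      : undefined,
    yuqueServer: yuqueServer
      ? { userId: yuqueServer.userId, token: '', basePath: yuqueServer.basePath }
      : undefined,
    feishuServer: feishuServer
      ? { appId: feishuServer.appId, appSecret: '', folderToken: feishuServer.folderToken }
      : undefined
  };
};

// Secrets never leave the server; non-sensitive fields pass through:
const publicView = filterApiDatasetServerPublicData({
  yuqueServer: { userId: 'u-123', token: 'secret-token', basePath: '/root' }
});
```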
@@ -6,11 +6,51 @@ export enum DatasetTypeEnum {
  dataset = 'dataset',
  websiteDataset = 'websiteDataset', // deep link
  externalFile = 'externalFile',

  apiDataset = 'apiDataset',
  feishu = 'feishu',
  yuque = 'yuque'
}
export const DatasetTypeMap = {

// @ts-ignore
export const ApiDatasetTypeMap: Record<
  `${DatasetTypeEnum}`,
  {
    icon: string;
    label: any;
    collectionLabel: string;
    courseUrl?: string;
  }
> = {
  [DatasetTypeEnum.apiDataset]: {
    icon: 'core/dataset/externalDatasetOutline',
    label: i18nT('dataset:api_file'),
    collectionLabel: i18nT('common:File'),
    courseUrl: '/docs/guide/knowledge_base/api_dataset/'
  },
  [DatasetTypeEnum.feishu]: {
    icon: 'core/dataset/feishuDatasetOutline',
    label: i18nT('dataset:feishu_dataset'),
    collectionLabel: i18nT('common:File'),
    courseUrl: '/docs/guide/knowledge_base/lark_dataset/'
  },
  [DatasetTypeEnum.yuque]: {
    icon: 'core/dataset/yuqueDatasetOutline',
    label: i18nT('dataset:yuque_dataset'),
    collectionLabel: i18nT('common:File'),
    courseUrl: '/docs/guide/knowledge_base/yuque_dataset/'
  }
};
export const DatasetTypeMap: Record<
  `${DatasetTypeEnum}`,
  {
    icon: string;
    label: any;
    collectionLabel: string;
    courseUrl?: string;
  }
> = {
  ...ApiDatasetTypeMap,
  [DatasetTypeEnum.folder]: {
    icon: 'common/folderFill',
    label: i18nT('dataset:folder_dataset'),
@@ -24,27 +64,13 @@ export const DatasetTypeMap = {
  [DatasetTypeEnum.websiteDataset]: {
    icon: 'core/dataset/websiteDatasetOutline',
    label: i18nT('dataset:website_dataset'),
    collectionLabel: i18nT('common:Website')
    collectionLabel: i18nT('common:Website'),
    courseUrl: '/docs/guide/knowledge_base/websync/'
  },
  [DatasetTypeEnum.externalFile]: {
    icon: 'core/dataset/externalDatasetOutline',
    label: i18nT('dataset:external_file'),
    collectionLabel: i18nT('common:File')
  },
  [DatasetTypeEnum.apiDataset]: {
    icon: 'core/dataset/externalDatasetOutline',
    label: i18nT('dataset:api_file'),
    collectionLabel: i18nT('common:File')
  },
  [DatasetTypeEnum.feishu]: {
    icon: 'core/dataset/feishuDatasetOutline',
    label: i18nT('dataset:feishu_dataset'),
    collectionLabel: i18nT('common:File')
  },
  [DatasetTypeEnum.yuque]: {
    icon: 'core/dataset/yuqueDatasetOutline',
    label: i18nT('dataset:yuque_dataset'),
    collectionLabel: i18nT('common:File')
  }
};
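The refactor above hoists the three API-backed dataset types into `ApiDatasetTypeMap` and spreads it into `DatasetTypeMap`, so UI code can replace the old three-way type comparison with a single membership check. A sketch with simplified entries — the `i18nT` calls are replaced by plain strings, and the entry fields are trimmed, purely to keep the sketch standalone:

```typescript
enum DatasetTypeEnum {
  folder = 'folder',
  dataset = 'dataset',
  apiDataset = 'apiDataset',
  feishu = 'feishu',
  yuque = 'yuque'
}

type DatasetTypeItem = { icon: string; label: string; courseUrl?: string };

// API-backed types live in their own map...
const ApiDatasetTypeMap: Partial<Record<DatasetTypeEnum, DatasetTypeItem>> = {
  [DatasetTypeEnum.apiDataset]: {
    icon: 'core/dataset/externalDatasetOutline',
    label: 'API file',
    courseUrl: '/docs/guide/knowledge_base/api_dataset/'
  },
  [DatasetTypeEnum.feishu]: {
    icon: 'core/dataset/feishuDatasetOutline',
    label: 'Feishu',
    courseUrl: '/docs/guide/knowledge_base/lark_dataset/'
  },
  [DatasetTypeEnum.yuque]: {
    icon: 'core/dataset/yuqueDatasetOutline',
    label: 'Yuque',
    courseUrl: '/docs/guide/knowledge_base/yuque_dataset/'
  }
};

// ...and are spread into the full map alongside the other types.
const DatasetTypeMap: Partial<Record<DatasetTypeEnum, DatasetTypeItem>> = {
  ...ApiDatasetTypeMap,
  [DatasetTypeEnum.folder]: { icon: 'common/folderFill', label: 'Folder' },
  [DatasetTypeEnum.dataset]: { icon: 'core/dataset/commonDatasetOutline', label: 'Dataset' }
};

// A membership check replaces `type === apiDataset || type === feishu || type === yuque`:
const isApiBacked = (type: DatasetTypeEnum) => Boolean(ApiDatasetTypeMap[type]);
```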
**packages/global/core/dataset/type.d.ts** (15 changes, vendored)

@@ -13,7 +13,12 @@ import type {
  ChunkTriggerConfigTypeEnum
} from './constants';
import type { DatasetPermission } from '../../support/permission/dataset/controller';
import type { APIFileServer, FeishuServer, YuqueServer } from './apiDataset';
import type {
  ApiDatasetServerType,
  APIFileServer,
  FeishuServer,
  YuqueServer
} from './apiDataset/type';
import type { SourceMemberType } from 'support/user/type';
import type { DatasetDataIndexTypeEnum } from './data/constants';
import type { ParentIdType } from 'common/parentFolder/type';
@@ -73,14 +78,16 @@ export type DatasetSchemaType = {
  chunkSettings?: ChunkSettingsType;

  inheritPermission: boolean;
  apiServer?: APIFileServer;
  feishuServer?: FeishuServer;
  yuqueServer?: YuqueServer;

  apiDatasetServer?: ApiDatasetServerType;

  // abandon
  autoSync?: boolean;
  externalReadUrl?: string;
  defaultPermission?: number;
  apiServer?: APIFileServer;
  feishuServer?: FeishuServer;
  yuqueServer?: YuqueServer;
};

export type DatasetCollectionSchemaType = ChunkSettingsType & {
**packages/service/common/api/type.d.ts** (7 changes, vendored)

@@ -1,5 +1,8 @@
import type { ApiDatasetDetailResponse } from '@fastgpt/global/core/dataset/apiDataset';
import { FeishuServer, YuqueServer } from '@fastgpt/global/core/dataset/apiDataset';
import type {
  ApiDatasetDetailResponse,
  FeishuServer,
  YuqueServer
} from '@fastgpt/global/core/dataset/apiDataset/type';
import type {
  DeepRagSearchProps,
  SearchDatasetDataResponse
@@ -3,12 +3,11 @@ import type {
  ApiFileReadContentResponse,
  APIFileReadResponse,
  ApiDatasetDetailResponse,
  APIFileServer,
  APIFileItem
} from '@fastgpt/global/core/dataset/apiDataset';
  APIFileServer
} from '@fastgpt/global/core/dataset/apiDataset/type';
import axios, { type Method } from 'axios';
import { addLog } from '../../../common/system/log';
import { readFileRawTextByUrl } from '../read';
import { addLog } from '../../../../common/system/log';
import { readFileRawTextByUrl } from '../../read';
import { type ParentIdType } from '@fastgpt/global/common/parentFolder/type';
import { type RequireOnlyOne } from '@fastgpt/global/common/type/utils';
@@ -3,10 +3,10 @@ import type {
  ApiFileReadContentResponse,
  ApiDatasetDetailResponse,
  FeishuServer
} from '@fastgpt/global/core/dataset/apiDataset';
} from '@fastgpt/global/core/dataset/apiDataset/type';
import { type ParentIdType } from '@fastgpt/global/common/parentFolder/type';
import axios, { type Method } from 'axios';
import { addLog } from '../../../common/system/log';
import { addLog } from '../../../../common/system/log';

type ResponseDataType = {
  success: boolean;
@@ -1,18 +1,10 @@
import type {
  APIFileServer,
  YuqueServer,
  FeishuServer
} from '@fastgpt/global/core/dataset/apiDataset';
import { useApiDatasetRequest } from './api';
import { useYuqueDatasetRequest } from '../yuqueDataset/api';
import { useFeishuDatasetRequest } from '../feishuDataset/api';
import { useApiDatasetRequest } from './custom/api';
import { useYuqueDatasetRequest } from './yuqueDataset/api';
import { useFeishuDatasetRequest } from './feishuDataset/api';
import type { ApiDatasetServerType } from '@fastgpt/global/core/dataset/apiDataset/type';

export const getApiDatasetRequest = async (data: {
  apiServer?: APIFileServer;
  yuqueServer?: YuqueServer;
  feishuServer?: FeishuServer;
}) => {
  const { apiServer, yuqueServer, feishuServer } = data;
export const getApiDatasetRequest = async (apiDatasetServer?: ApiDatasetServerType) => {
  const { apiServer, yuqueServer, feishuServer } = apiDatasetServer || {};

  if (apiServer) {
    return useApiDatasetRequest({ apiServer });
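After this refactor, the dispatcher takes one `apiDatasetServer` object and picks a hook by whichever server config is present. A sketch of how a new library would slot in — the `demoServer` name and its branch are hypothetical, and the hook objects are reduced to string tags so the sketch runs standalone:

```typescript
type ApiDatasetServerType = {
  apiServer?: { baseUrl: string };
  yuqueServer?: { userId: string };
  feishuServer?: { appId: string };
  demoServer?: { spaceId: string }; // hypothetical new third-party library
};

// Mirrors the shape of getApiDatasetRequest: the first matching config wins.
const pickApiDatasetRequest = async (apiDatasetServer?: ApiDatasetServerType) => {
  const { apiServer, yuqueServer, feishuServer, demoServer } =
    apiDatasetServer ?? ({} as ApiDatasetServerType);
  if (apiServer) return 'useApiDatasetRequest';
  if (yuqueServer) return 'useYuqueDatasetRequest';
  if (feishuServer) return 'useFeishuDatasetRequest';
  if (demoServer) return 'useDemoDatasetRequest'; // the branch you add for a new library
  return Promise.reject('Cannot find api dataset server');
};
```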
@@ -3,9 +3,9 @@ import type {
  ApiFileReadContentResponse,
  YuqueServer,
  ApiDatasetDetailResponse
} from '@fastgpt/global/core/dataset/apiDataset';
} from '@fastgpt/global/core/dataset/apiDataset/type';
import axios, { type Method } from 'axios';
import { addLog } from '../../../common/system/log';
import { addLog } from '../../../../common/system/log';
import { type ParentIdType } from '@fastgpt/global/common/parentFolder/type';

type ResponseDataType = {
@@ -105,7 +105,6 @@ export const useYuqueDatasetRequest = ({ yuqueServer }: { yuqueServer: YuqueServ
  if (!parentId) {
    if (yuqueServer.basePath) parentId = yuqueServer.basePath;
  }

  let files: APIFileItem[] = [];

  if (!parentId) {
@@ -157,9 +157,7 @@ export const syncCollection = async (collection: CollectionWithDatasetType) => {
  return {
    type: DatasetSourceReadTypeEnum.apiFile,
    sourceId,
    apiServer: dataset.apiServer,
    feishuServer: dataset.feishuServer,
    yuqueServer: dataset.yuqueServer
    apiDatasetServer: dataset.apiDatasetServer
  };
})();
@@ -9,13 +9,9 @@ import { type TextSplitProps, splitText2Chunks } from '@fastgpt/global/common/st
import axios from 'axios';
import { readRawContentByFileBuffer } from '../../common/file/read/utils';
import { parseFileExtensionFromUrl } from '@fastgpt/global/common/string/tools';
import {
  type APIFileServer,
  type FeishuServer,
  type YuqueServer
} from '@fastgpt/global/core/dataset/apiDataset';
import { getApiDatasetRequest } from './apiDataset';
import Papa from 'papaparse';
import type { ApiDatasetServerType } from '@fastgpt/global/core/dataset/apiDataset/type';

export const readFileRawTextByUrl = async ({
  teamId,
@@ -69,9 +65,7 @@ export const readDatasetSourceRawText = async ({
  sourceId,
  selector,
  externalFileId,
  apiServer,
  feishuServer,
  yuqueServer,
  apiDatasetServer,
  customPdfParse,
  getFormatText
}: {
@@ -84,9 +78,7 @@ export const readDatasetSourceRawText = async ({

  selector?: string; // link selector
  externalFileId?: string; // external file dataset
  apiServer?: APIFileServer; // api dataset
  feishuServer?: FeishuServer; // feishu dataset
  yuqueServer?: YuqueServer; // yuque dataset
  apiDatasetServer?: ApiDatasetServerType; // api dataset
}): Promise<{
  title?: string;
  rawText: string;
@@ -128,9 +120,7 @@ export const readDatasetSourceRawText = async ({
  };
} else if (type === DatasetSourceReadTypeEnum.apiFile) {
  const { title, rawText } = await readApiServerFileContent({
    apiServer,
    feishuServer,
    yuqueServer,
    apiDatasetServer,
    apiFileId: sourceId,
    teamId,
    tmbId
@@ -147,17 +137,13 @@ export const readDatasetSourceRawText = async ({
};

export const readApiServerFileContent = async ({
  apiServer,
  feishuServer,
  yuqueServer,
  apiDatasetServer,
  apiFileId,
  teamId,
  tmbId,
  customPdfParse
}: {
  apiServer?: APIFileServer;
  feishuServer?: FeishuServer;
  yuqueServer?: YuqueServer;
  apiDatasetServer?: ApiDatasetServerType;
  apiFileId: string;
  teamId: string;
  tmbId: string;
@@ -166,13 +152,7 @@ export const readApiServerFileContent = async ({
  title?: string;
  rawText: string;
}> => {
  return (
    await getApiDatasetRequest({
      apiServer,
      yuqueServer,
      feishuServer
    })
  ).getFileContent({
  return (await getApiDatasetRequest(apiDatasetServer)).getFileContent({
    teamId,
    tmbId,
    apiFileId,
@@ -127,14 +127,16 @@ const DatasetSchema = new Schema({
    type: Boolean,
    default: true
  },
  apiServer: Object,
  feishuServer: Object,
  yuqueServer: Object,

  apiDatasetServer: Object,

  // abandoned
  autoSync: Boolean,
  externalReadUrl: String,
  defaultPermission: Number
  defaultPermission: Number,
  apiServer: Object,
  feishuServer: Object,
  yuqueServer: Object
});

try {
@@ -6,7 +6,7 @@ import type {
  APIFileServer,
  FeishuServer,
  YuqueServer
} from '@fastgpt/global/core/dataset/apiDataset';
} from '@fastgpt/global/core/dataset/apiDataset/type';
import type {
  DatasetSearchModeEnum,
  DatasetTypeEnum
@@ -17,6 +17,7 @@ import {
  TrainingModeEnum
} from '@fastgpt/global/core/dataset/constants';
import type { SearchDataResponseItemType } from '@fastgpt/global/core/dataset/type';
import type { ApiDatasetServerType } from '@fastgpt/global/core/dataset/apiDataset/type';
import { DatasetDataIndexItemType } from '@fastgpt/global/core/dataset/type';
import type { NodeInputKeyEnum } from '@fastgpt/global/core/workflow/constants';
import { PermissionValueType } from '@fastgpt/global/support/permission/type';
@@ -31,9 +32,7 @@ export type CreateDatasetParams = {
  vectorModel?: string;
  agentModel?: string;
  vlmModel?: string;
  apiServer?: APIFileServer;
  feishuServer?: FeishuServer;
  yuqueServer?: YuqueServer;
  apiDatasetServer?: ApiDatasetServerType;
};

export type RebuildEmbeddingProps = {
@@ -3,11 +3,6 @@ import { DatasetTypeEnum } from '@fastgpt/global/core/dataset/constants';
import { Flex, Input, Button, ModalBody, ModalFooter, Box } from '@chakra-ui/react';
import type { UseFormReturn } from 'react-hook-form';
import { useTranslation } from 'next-i18next';
import type {
  APIFileServer,
  FeishuServer,
  YuqueServer
} from '@fastgpt/global/core/dataset/apiDataset';
import { getApiDatasetPaths, getApiDatasetCatalog } from '@/web/core/dataset/api';
import type {
  GetResourceFolderListItemResponse,
@@ -22,6 +17,7 @@ import FormLabel from '@fastgpt/web/components/common/MyBox/FormLabel';
import MyModal from '@fastgpt/web/components/common/MyModal';
import MyIcon from '@fastgpt/web/components/common/Icon';
import { FolderIcon } from '@fastgpt/global/common/file/image/constants';
import type { ApiDatasetServerType } from '@fastgpt/global/core/dataset/apiDataset/type';

const ApiDatasetForm = ({
  type,
@@ -32,9 +28,7 @@ const ApiDatasetForm = ({
  datasetId?: string;
  form: UseFormReturn<
    {
      apiServer?: APIFileServer;
      feishuServer?: FeishuServer;
      yuqueServer?: YuqueServer;
      apiDatasetServer?: ApiDatasetServerType;
    },
    any
  >;
@@ -42,9 +36,10 @@ const ApiDatasetForm = ({
  const { t } = useTranslation();
  const { register, setValue, watch } = form;

  const yuqueServer = watch('yuqueServer');
  const feishuServer = watch('feishuServer');
  const apiServer = watch('apiServer');
  const apiDatasetServer = watch('apiDatasetServer');
  const yuqueServer = apiDatasetServer?.yuqueServer;
  const feishuServer = apiDatasetServer?.feishuServer;
  const apiServer = apiDatasetServer?.apiServer;

  const [pathNames, setPathNames] = useState(t('dataset:rootdirectory'));
  const [
@@ -91,9 +86,7 @@ const ApiDatasetForm = ({
  const path = await getApiDatasetPaths({
    datasetId,
    parentId,
    yuqueServer,
    feishuServer,
    apiServer
    apiDatasetServer
  });
  setPathNames(path);
},
@@ -108,13 +101,13 @@ const ApiDatasetForm = ({
  const value = id === 'root' || id === null || id === undefined ? '' : id;
  switch (type) {
    case DatasetTypeEnum.yuque:
      setValue('yuqueServer.basePath', value);
      setValue('apiDatasetServer.yuqueServer.basePath', value);
      break;
    case DatasetTypeEnum.feishu:
      setValue('feishuServer.folderToken', value);
      setValue('apiDatasetServer.feishuServer.folderToken', value);
      break;
    case DatasetTypeEnum.apiDataset:
      setValue('apiServer.basePath', value);
      setValue('apiDatasetServer.apiServer.basePath', value);
      break;
  }

@@ -147,32 +140,10 @@ const ApiDatasetForm = ({
<BaseUrlSelector
  selectId={yuqueServer?.basePath || apiServer?.basePath || 'root'}
  server={async (e: GetResourceFolderListProps) => {
    const params: GetApiDatasetCataLogProps = { parentId: e.parentId };

    switch (type) {
      case DatasetTypeEnum.yuque:
        params.yuqueServer = {
          userId: yuqueServer?.userId || '',
          token: yuqueServer?.token || '',
          basePath: ''
        };
        break;
      // Currently, only Yuque is using it
      case DatasetTypeEnum.feishu:
        params.feishuServer = {
          appId: feishuServer?.appId || '',
          appSecret: feishuServer?.appSecret || '',
          folderToken: feishuServer?.folderToken || ''
        };
        break;
      case DatasetTypeEnum.apiDataset:
        params.apiServer = {
          baseUrl: apiServer?.baseUrl || '',
          authorization: apiServer?.authorization || '',
          basePath: ''
        };
        break;
    }
    const params: GetApiDatasetCataLogProps = {
      parentId: e.parentId,
      apiDatasetServer
    };

    return getApiDatasetCatalog(params);
  }}
@@ -193,7 +164,7 @@ const ApiDatasetForm = ({
  bg={'myWhite.600'}
  placeholder={t('dataset:api_url')}
  maxLength={200}
  {...register('apiServer.baseUrl', { required: true })}
  {...register('apiDatasetServer.apiServer.baseUrl', { required: true })}
/>
</Flex>
<Flex mt={6} alignItems={'center'}>
@@ -204,7 +175,7 @@ const ApiDatasetForm = ({
  bg={'myWhite.600'}
  placeholder={t('dataset:request_headers')}
  maxLength={2000}
  {...register('apiServer.authorization')}
  {...register('apiDatasetServer.apiServer.authorization')}
/>
</Flex>
{renderBaseUrlSelector()}
@@ -227,7 +198,7 @@ const ApiDatasetForm = ({
  bg={'myWhite.600'}
  placeholder={'App ID'}
  maxLength={200}
  {...register('feishuServer.appId', { required: true })}
  {...register('apiDatasetServer.feishuServer.appId', { required: true })}
/>
</Flex>
<Flex mt={6}>
@@ -244,7 +215,7 @@ const ApiDatasetForm = ({
  bg={'myWhite.600'}
  placeholder={'App Secret'}
  maxLength={200}
  {...register('feishuServer.appSecret', { required: true })}
  {...register('apiDatasetServer.feishuServer.appSecret', { required: true })}
/>
</Flex>
<Flex mt={6}>
@@ -261,7 +232,7 @@ const ApiDatasetForm = ({
  bg={'myWhite.600'}
  placeholder={'Folder Token'}
  maxLength={200}
  {...register('feishuServer.folderToken', { required: true })}
  {...register('apiDatasetServer.feishuServer.folderToken', { required: true })}
/>
</Flex>
{/* {renderBaseUrlSelector()}
@@ -278,7 +249,7 @@ const ApiDatasetForm = ({
  bg={'myWhite.600'}
  placeholder={'User ID'}
  maxLength={200}
  {...register('yuqueServer.userId', { required: true })}
  {...register('apiDatasetServer.yuqueServer.userId', { required: true })}
/>
</Flex>
<Flex mt={6} alignItems={'center'}>
@@ -289,7 +260,7 @@ const ApiDatasetForm = ({
  bg={'myWhite.600'}
  placeholder={'Token'}
  maxLength={200}
  {...register('yuqueServer.token', { required: true })}
  {...register('apiDatasetServer.yuqueServer.token', { required: true })}
/>
</Flex>
{renderBaseUrlSelector()}
@@ -17,7 +17,8 @@ import {
  DatasetCollectionTypeEnum,
  DatasetTypeEnum,
  DatasetTypeMap,
  DatasetStatusEnum
  DatasetStatusEnum,
  ApiDatasetTypeMap
} from '@fastgpt/global/core/dataset/constants';
import EditFolderModal, { useEditFolder } from '../../EditFolderModal';
import { TabEnum } from '../../../../pages/dataset/detail/index';
@@ -435,9 +436,7 @@ const Header = ({ hasTrainingData }: { hasTrainingData: boolean }) => {
  />
)}
{/* apiDataset */}
{(datasetDetail?.type === DatasetTypeEnum.apiDataset ||
  datasetDetail?.type === DatasetTypeEnum.feishu ||
  datasetDetail?.type === DatasetTypeEnum.yuque) && (
{datasetDetail?.type && ApiDatasetTypeMap[datasetDetail.type] && (
  <Flex
    px={3.5}
    py={2}
@@ -13,7 +13,7 @@ import { type ParentTreePathItemType } from '@fastgpt/global/common/parentFolder
import FolderPath from '@/components/common/folder/Path';
import { getSourceNameIcon } from '@fastgpt/global/core/dataset/utils';
import MyBox from '@fastgpt/web/components/common/MyBox';
import { type APIFileItem } from '@fastgpt/global/core/dataset/apiDataset';
import { type APIFileItem } from '@fastgpt/global/core/dataset/apiDataset/type';
import SearchInput from '@fastgpt/web/components/common/Input/SearchInput';
import { useMount } from 'ahooks';
@@ -5,23 +5,17 @@ import { useTranslation } from 'next-i18next';
import { useRequest2 } from '@fastgpt/web/hooks/useRequest';
import { useForm } from 'react-hook-form';
import { useToast } from '@fastgpt/web/hooks/useToast';
- import {
-   type APIFileServer,
-   type FeishuServer,
-   type YuqueServer
- } from '@fastgpt/global/core/dataset/apiDataset';
import ApiDatasetForm from '@/pageComponents/dataset/ApiDatasetForm';
import { useContextSelector } from 'use-context-selector';
import { DatasetPageContext } from '@/web/core/dataset/context/datasetPageContext';
- import { datasetTypeCourseMap } from '@/web/core/dataset/constants';
import { getDocPath } from '@/web/common/system/doc';
import MyIcon from '@fastgpt/web/components/common/Icon';
+ import type { ApiDatasetServerType } from '@fastgpt/global/core/dataset/apiDataset/type';
+ import { DatasetTypeMap } from '@fastgpt/global/core/dataset/constants';

export type EditAPIDatasetInfoFormType = {
  id: string;
- apiServer?: APIFileServer;
- yuqueServer?: YuqueServer;
- feishuServer?: FeishuServer;
+ apiDatasetServer?: ApiDatasetServerType;
};

const EditAPIDatasetInfoModal = ({
@@ -60,7 +54,7 @@ const EditAPIDatasetInfoModal = ({
  return (
    <MyModal isOpen onClose={onClose} w={'450px'} iconSrc="modal/edit" title={title}>
      <ModalBody>
-       {datasetTypeCourseMap[type] && (
+       {DatasetTypeMap[type]?.courseUrl && (
          <Flex alignItems={'center'} justifyContent={'space-between'}>
            <Box color={'myGray.900'} fontSize={'sm'} fontWeight={500}>
              {t('dataset:apidataset_configuration')}
@@ -71,7 +65,7 @@ const EditAPIDatasetInfoModal = ({
              color={'primary.600'}
              fontSize={'sm'}
              cursor={'pointer'}
-             onClick={() => window.open(getDocPath(datasetTypeCourseMap[type]), '_blank')}
+             onClick={() => window.open(getDocPath(DatasetTypeMap[type].courseUrl!), '_blank')}
            >
              <MyIcon name={'book'} w={4} mr={0.5} />
              {t('common:Instructions')}

@@ -311,12 +311,12 @@ const Info = ({ datasetId }: { datasetId: string }) => {
          onClick={() =>
            setEditedAPIDataset({
              id: datasetDetail._id,
-             apiServer: datasetDetail.apiServer
+             apiDatasetServer: datasetDetail.apiDatasetServer
            })
          }
        />
      </Flex>
-     <Box fontSize={'mini'}>{datasetDetail.apiServer?.baseUrl}</Box>
+     <Box fontSize={'mini'}>{datasetDetail.apiDatasetServer?.apiServer?.baseUrl}</Box>
    </Box>
  </>
)}
@@ -336,12 +336,12 @@ const Info = ({ datasetId }: { datasetId: string }) => {
          onClick={() =>
            setEditedAPIDataset({
              id: datasetDetail._id,
-             yuqueServer: datasetDetail.yuqueServer
+             apiDatasetServer: datasetDetail.apiDatasetServer
            })
          }
        />
      </Flex>
-     <Box fontSize={'mini'}>{datasetDetail.yuqueServer?.userId}</Box>
+     <Box fontSize={'mini'}>{datasetDetail.apiDatasetServer?.yuqueServer?.userId}</Box>
    </Box>
  </>
)}
@@ -361,12 +361,14 @@ const Info = ({ datasetId }: { datasetId: string }) => {
          onClick={() =>
            setEditedAPIDataset({
              id: datasetDetail._id,
-             feishuServer: datasetDetail.feishuServer
+             apiDatasetServer: datasetDetail.apiDatasetServer
            })
          }
        />
      </Flex>
-     <Box fontSize={'mini'}>{datasetDetail.feishuServer?.folderToken}</Box>
+     <Box fontSize={'mini'}>
+       {datasetDetail.apiDatasetServer?.feishuServer?.folderToken}
+     </Box>
    </Box>
  </>
)}
@@ -435,9 +437,7 @@ const Info = ({ datasetId }: { datasetId: string }) => {
  onEdit={(data) =>
    updateDataset({
      id: datasetId,
-     apiServer: data.apiServer,
-     yuqueServer: data.yuqueServer,
-     feishuServer: data.feishuServer
+     apiDatasetServer: data.apiDatasetServer
    })
  }
/>

@@ -11,14 +11,13 @@ import MyModal from '@fastgpt/web/components/common/MyModal';
import { postCreateDataset } from '@/web/core/dataset/api';
import type { CreateDatasetParams } from '@/global/core/dataset/api.d';
import { useTranslation } from 'next-i18next';
- import { DatasetTypeEnum } from '@fastgpt/global/core/dataset/constants';
+ import { DatasetTypeEnum, DatasetTypeMap } from '@fastgpt/global/core/dataset/constants';
import AIModelSelector from '@/components/Select/AIModelSelector';
import { useSystem } from '@fastgpt/web/hooks/useSystem';
import QuestionTip from '@fastgpt/web/components/common/MyTooltip/QuestionTip';
import ComplianceTip from '@/components/common/ComplianceTip/index';
import MyIcon from '@fastgpt/web/components/common/Icon';
import { getDocPath } from '@/web/common/system/doc';
- import { datasetTypeCourseMap } from '@/web/core/dataset/constants';
import ApiDatasetForm from '../ApiDatasetForm';
import { getWebDefaultEmbeddingModel, getWebDefaultLLMModel } from '@/web/common/system/utils';

@@ -43,31 +42,6 @@ const CreateModal = ({
  const { defaultModels, embeddingModelList, datasetModelList, getVlmModelList } = useSystemStore();
  const { isPc } = useSystem();

- const datasetTypeMap = useMemo(() => {
-   return {
-     [DatasetTypeEnum.dataset]: {
-       name: t('dataset:common_dataset'),
-       icon: 'core/dataset/commonDatasetColor'
-     },
-     [DatasetTypeEnum.websiteDataset]: {
-       name: t('dataset:website_dataset'),
-       icon: 'core/dataset/websiteDatasetColor'
-     },
-     [DatasetTypeEnum.apiDataset]: {
-       name: t('dataset:api_file'),
-       icon: 'core/dataset/externalDatasetColor'
-     },
-     [DatasetTypeEnum.feishu]: {
-       name: t('dataset:feishu_dataset'),
-       icon: 'core/dataset/feishuDatasetColor'
-     },
-     [DatasetTypeEnum.yuque]: {
-       name: t('dataset:yuque_dataset'),
-       icon: 'core/dataset/yuqueDatasetColor'
-     }
-   };
- }, [t]);

  const filterNotHiddenVectorModelList = embeddingModelList.filter((item) => !item.hidden);

  const vllmModelList = useMemo(() => getVlmModelList(), [getVlmModelList]);
@@ -76,7 +50,7 @@ const CreateModal = ({
  defaultValues: {
    parentId,
    type: type || DatasetTypeEnum.dataset,
-   avatar: datasetTypeMap[type].icon,
+   avatar: DatasetTypeMap[type].icon,
    name: '',
    intro: '',
    vectorModel:
@@ -121,10 +95,10 @@ const CreateModal = ({
      w={'20px'}
      h={'20px'}
      borderRadius={'xs'}
-     src={datasetTypeMap[type].icon}
+     src={DatasetTypeMap[type].icon}
      pr={'10px'}
    />
-   {t('common:core.dataset.Create dataset', { name: datasetTypeMap[type].name })}
+   {t('common:core.dataset.Create dataset', { name: t(DatasetTypeMap[type].label) })}
  </Flex>
}
isOpen
@@ -138,14 +112,14 @@ const CreateModal = ({
  <Box color={'myGray.900'} fontWeight={500} fontSize={'sm'}>
    {t('common:input_name')}
  </Box>
- {datasetTypeCourseMap[type] && (
+ {DatasetTypeMap[type]?.courseUrl && (
    <Flex
      as={'span'}
      alignItems={'center'}
      color={'primary.600'}
      fontSize={'sm'}
      cursor={'pointer'}
-     onClick={() => window.open(getDocPath(datasetTypeCourseMap[type]), '_blank')}
+     onClick={() => window.open(getDocPath(DatasetTypeMap[type].courseUrl!), '_blank')}
    >
      <MyIcon name={'book'} w={4} mr={0.5} />
      {t('common:Instructions')}

@@ -1,41 +1,14 @@
import { Box, Flex, type FlexProps } from '@chakra-ui/react';
- import { DatasetTypeEnum } from '@fastgpt/global/core/dataset/constants';
+ import { DatasetTypeEnum, DatasetTypeMap } from '@fastgpt/global/core/dataset/constants';
import MyIcon from '@fastgpt/web/components/common/Icon';
- import React, { useMemo } from 'react';
+ import React from 'react';
import { useTranslation } from 'next-i18next';

const SideTag = ({ type, ...props }: { type: `${DatasetTypeEnum}` } & FlexProps) => {
  if (type === DatasetTypeEnum.folder) return null;
  const { t } = useTranslation();
- const DatasetListTypeMap = useMemo(() => {
-   return {
-     [DatasetTypeEnum.dataset]: {
-       icon: 'core/dataset/commonDatasetOutline',
-       label: t('dataset:common_dataset')
-     },
-     [DatasetTypeEnum.websiteDataset]: {
-       icon: 'core/dataset/websiteDatasetOutline',
-       label: t('dataset:website_dataset')
-     },
-     [DatasetTypeEnum.externalFile]: {
-       icon: 'core/dataset/externalDatasetOutline',
-       label: t('dataset:external_file')
-     },
-     [DatasetTypeEnum.apiDataset]: {
-       icon: 'core/dataset/externalDatasetOutline',
-       label: t('dataset:api_file')
-     },
-     [DatasetTypeEnum.feishu]: {
-       icon: 'core/dataset/feishuDatasetOutline',
-       label: t('dataset:feishu_dataset')
-     },
-     [DatasetTypeEnum.yuque]: {
-       icon: 'core/dataset/yuqueDatasetOutline',
-       label: t('dataset:yuque_dataset')
-     }
-   };
- }, [t]);
- const item = DatasetListTypeMap[type] || DatasetListTypeMap['dataset'];
+ const item = DatasetTypeMap[type] || DatasetTypeMap['dataset'];

  return (
    <Flex
@@ -50,7 +23,7 @@ const SideTag = ({ type, ...props }: { type: `${DatasetTypeEnum}` } & FlexProps)
    >
      <MyIcon name={item.icon as any} w={'0.8rem'} color={'myGray.400'} />
      <Box fontSize={'mini'} ml={1}>
-       {item.label}
+       {t(item.label)}
      </Box>
    </Flex>
  );

projects/app/src/pages/api/admin/initv4911.ts (new file, 38 lines)
@@ -0,0 +1,38 @@
+ import { NextAPI } from '@/service/middleware/entry';
+ import { authCert } from '@fastgpt/service/support/permission/auth/common';
+ import { type NextApiRequest, type NextApiResponse } from 'next';
+
+ import { MongoDataset } from '@fastgpt/service/core/dataset/schema';
+ import { DatasetTypeEnum } from '@fastgpt/global/core/dataset/constants';
+
+ async function handler(req: NextApiRequest, _res: NextApiResponse) {
+   await authCert({ req, authRoot: true });
+
+   console.log('Updating all API datasets');
+
+   const datasets = await MongoDataset.find({
+     type: {
+       $in: [DatasetTypeEnum.apiDataset, DatasetTypeEnum.feishu, DatasetTypeEnum.yuque]
+     }
+   }).lean();
+
+   for (const dataset of datasets) {
+     console.log(dataset._id);
+     await MongoDataset.updateOne(
+       { _id: dataset._id },
+       {
+         $set: {
+           apiDatasetServer: {
+             ...(dataset.apiServer && { apiServer: dataset.apiServer }),
+             ...(dataset.feishuServer && { feishuServer: dataset.feishuServer }),
+             ...(dataset.yuqueServer && { yuqueServer: dataset.yuqueServer })
+           }
+         }
+       }
+     );
+   }
+
+   return { success: true };
+ }
+
+ export default NextAPI(handler);
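The `$set` in the migration above folds the three legacy top-level fields into a single `apiDatasetServer` object, omitting any field the legacy document does not have. A standalone sketch of that merge (the types are illustrative, not the real Mongo schema):

```typescript
// Illustrative shapes only; the real schema types live in @fastgpt/global.
type LegacyDataset = {
  apiServer?: { baseUrl: string; authorization: string };
  feishuServer?: { appId: string; appSecret: string; folderToken: string };
  yuqueServer?: { userId: string; token: string };
};

// Mirrors the migration's $set: only fields present on the legacy
// document become keys of the new nested object.
export const buildApiDatasetServer = (dataset: LegacyDataset) => ({
  ...(dataset.apiServer && { apiServer: dataset.apiServer }),
  ...(dataset.feishuServer && { feishuServer: dataset.feishuServer }),
  ...(dataset.yuqueServer && { yuqueServer: dataset.yuqueServer })
});
```

So a legacy Yuque dataset migrates to `{ yuqueServer: { … } }` with no empty `apiServer` or `feishuServer` keys alongside it.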
@@ -1,35 +1,34 @@
import { getApiDatasetRequest } from '@fastgpt/service/core/dataset/apiDataset';
import { NextAPI } from '@/service/middleware/entry';
import type { ParentIdType } from '@fastgpt/global/common/parentFolder/type';
- import type {
-   APIFileItem,
-   APIFileServer,
-   YuqueServer,
-   FeishuServer
- } from '@fastgpt/global/core/dataset/apiDataset';
import { type NextApiRequest } from 'next';
import { authCert } from '@fastgpt/service/support/permission/auth/common';
+ import type {
+   ApiDatasetServerType,
+   APIFileItem
+ } from '@fastgpt/global/core/dataset/apiDataset/type';

export type GetApiDatasetCataLogProps = {
  parentId?: ParentIdType;
- yuqueServer?: YuqueServer;
- feishuServer?: FeishuServer;
- apiServer?: APIFileServer;
+ apiDatasetServer?: ApiDatasetServerType;
};

export type GetApiDatasetCataLogResponse = APIFileItem[];

async function handler(req: NextApiRequest) {
- let { searchKey = '', parentId = null, yuqueServer, feishuServer, apiServer } = req.body;
+ let { searchKey = '', parentId = null, apiDatasetServer } = req.body;

  await authCert({ req, authToken: true });

+ // Remove basePath from apiDatasetServer
+ Object.values(apiDatasetServer).forEach((server: any) => {
+   if (server.basePath) {
+     delete server.basePath;
+   }
+ });

  const data = await (
-   await getApiDatasetRequest({
-     feishuServer,
-     yuqueServer,
-     apiServer
-   })
+   await getApiDatasetRequest(apiDatasetServer)
  ).listFiles({ parentId, searchKey });

  return data?.filter((item: APIFileItem) => item.hasChild === true) || [];

@@ -2,11 +2,9 @@ import { NextAPI } from '@/service/middleware/entry';
import { DatasetErrEnum } from '@fastgpt/global/common/error/code/dataset';
import type { ParentIdType } from '@fastgpt/global/common/parentFolder/type';
import type {
- APIFileServer,
- YuqueServer,
- FeishuServer,
- ApiDatasetDetailResponse
- } from '@fastgpt/global/core/dataset/apiDataset';
+ ApiDatasetDetailResponse,
+ ApiDatasetServerType
+ } from '@fastgpt/global/core/dataset/apiDataset/type';
import { getApiDatasetRequest } from '@fastgpt/service/core/dataset/apiDataset';
import type { ApiRequestProps, ApiResponseType } from '@fastgpt/service/type/next';
import { authCert } from '@fastgpt/service/support/permission/auth/common';
@@ -18,9 +16,7 @@ export type GetApiDatasetPathQuery = {};
export type GetApiDatasetPathBody = {
  datasetId?: string;
  parentId?: ParentIdType;
- yuqueServer?: YuqueServer;
- feishuServer?: FeishuServer;
- apiServer?: APIFileServer;
+ apiDatasetServer?: ApiDatasetServerType;
};

export type GetApiDatasetPathResponse = string;
@@ -50,7 +46,7 @@ async function handler(
  const { datasetId, parentId } = req.body;
  if (!parentId) return '';

- const { yuqueServer, feishuServer, apiServer } = await (async () => {
+ const apiDatasetServer = await (async () => {
    if (datasetId) {
      const { dataset } = await authDataset({
        req,
@@ -60,49 +56,21 @@ async function handler(
        datasetId
      });

-     return {
-       yuqueServer: req.body.yuqueServer
-         ? { ...req.body.yuqueServer, token: dataset.yuqueServer?.token ?? '' }
-         : dataset.yuqueServer,
-       feishuServer: req.body.feishuServer
-         ? { ...req.body.feishuServer, appSecret: dataset.feishuServer?.appSecret ?? '' }
-         : dataset.feishuServer,
-       apiServer: req.body.apiServer
-         ? {
-             ...req.body.apiServer,
-             authorization: dataset.apiServer?.authorization ?? ''
-           }
-         : dataset.apiServer
-     };
+     return dataset.apiDatasetServer;
    } else {
      await authCert({ req, authToken: true });

-     return {
-       yuqueServer: req.body.yuqueServer,
-       feishuServer: req.body.feishuServer,
-       apiServer: req.body.apiServer
-     };
+     return req.body.apiDatasetServer;
    }
  })();

- if (feishuServer) {
-   return '';
+ const apiDataset = await getApiDatasetRequest(apiDatasetServer);
+
+ if (!apiDataset?.getFileDetail) {
+   return Promise.reject(DatasetErrEnum.noApiServer);
  }

- if (yuqueServer || apiServer) {
-   const apiDataset = await getApiDatasetRequest({
-     yuqueServer,
-     apiServer
-   });
-
-   if (!apiDataset?.getFileDetail) {
-     return Promise.reject(DatasetErrEnum.noApiServer);
-   }
-
-   return await getFullPath(parentId, apiDataset.getFileDetail);
- }
-
- return Promise.reject(new Error(DatasetErrEnum.noApiServer));
+ return await getFullPath(parentId, apiDataset.getFileDetail);
}

export default NextAPI(handler);

@@ -23,17 +23,7 @@ async function handler(req: NextApiRequest) {
    per: ReadPermissionVal
  });

- const apiServer = dataset.apiServer;
- const feishuServer = dataset.feishuServer;
- const yuqueServer = dataset.yuqueServer;
-
- return (
-   await getApiDatasetRequest({
-     apiServer,
-     yuqueServer,
-     feishuServer
-   })
- ).listFiles({ searchKey, parentId });
+ return (await getApiDatasetRequest(dataset.apiDatasetServer)).listFiles({ searchKey, parentId });
}

export default NextAPI(handler);

@@ -23,10 +23,6 @@ async function handler(req: NextApiRequest): CreateCollectionResponse {
    per: WritePermissionVal
  });

- const apiServer = dataset.apiServer;
- const feishuServer = dataset.feishuServer;
- const yuqueServer = dataset.yuqueServer;
-
  // Auth same apiFileId
  const storeCol = await MongoDatasetCollection.findOne(
    {
@@ -42,9 +38,7 @@ async function handler(req: NextApiRequest): CreateCollectionResponse {
  }

  const { title, rawText } = await readApiServerFileContent({
-   apiServer,
-   feishuServer,
-   yuqueServer,
+   apiDatasetServer: dataset.apiDatasetServer,
    apiFileId,
    teamId,
    tmbId,

@@ -62,9 +62,7 @@ async function handler(
    return {
      type: DatasetSourceReadTypeEnum.apiFile,
      sourceId: collection.apiFileId,
-     apiServer: collection.dataset.apiServer,
-     feishuServer: collection.dataset.feishuServer,
-     yuqueServer: collection.dataset.yuqueServer
+     apiDatasetServer: collection.dataset.apiDatasetServer
    };
  }
  if (collection.type === DatasetCollectionTypeEnum.externalFile) {

@@ -94,17 +94,7 @@ async function handler(
    return collection.rawLink;
  }
  if (collection.type === DatasetCollectionTypeEnum.apiFile && collection.apiFileId) {
-   const apiServer = collection.dataset.apiServer;
-   const feishuServer = collection.dataset.feishuServer;
-   const yuqueServer = collection.dataset.yuqueServer;
-
-   return (
-     await getApiDatasetRequest({
-       apiServer,
-       feishuServer,
-       yuqueServer
-     })
-   ).getFilePreviewUrl({
+   return (await getApiDatasetRequest(collection.dataset.apiDatasetServer)).getFilePreviewUrl({
      apiFileId: collection.apiFileId
    });
  }

@@ -38,9 +38,7 @@ async function handler(
    vectorModel = getDefaultEmbeddingModel()?.model,
    agentModel = getDatasetModel()?.model,
    vlmModel,
-   apiServer,
-   feishuServer,
-   yuqueServer
+   apiDatasetServer
  } = req.body;

  // auth
@@ -86,9 +84,7 @@ async function handler(
        vlmModel,
        avatar,
        type,
-       apiServer,
-       feishuServer,
-       yuqueServer
+       apiDatasetServer
      }
    ],
    { session, ordered: true }

@@ -7,6 +7,7 @@ import { type ApiRequestProps } from '@fastgpt/service/type/next';
import { CommonErrEnum } from '@fastgpt/global/common/error/code/common';
import { getWebsiteSyncDatasetStatus } from '@fastgpt/service/core/dataset/websiteSync';
import { DatasetStatusEnum, DatasetTypeEnum } from '@fastgpt/global/core/dataset/constants';
+ import { filterApiDatasetServerPublicData } from '@fastgpt/global/core/dataset/apiDataset/utils';

type Query = {
  id: string;
@@ -40,36 +41,16 @@ async function handler(req: ApiRequestProps<Query>): Promise<DatasetItemType> {
      errorMsg: undefined
    };
  })();

+ console.log(filterApiDatasetServerPublicData(dataset.apiDatasetServer));
  return {
    ...dataset,
    status,
    errorMsg,
-   apiServer: dataset.apiServer
-     ? {
-         baseUrl: dataset.apiServer.baseUrl,
-         authorization: '',
-         basePath: dataset.apiServer.basePath
-       }
-     : undefined,
-   yuqueServer: dataset.yuqueServer
-     ? {
-         userId: dataset.yuqueServer.userId,
-         token: '',
-         basePath: dataset.yuqueServer.basePath
-       }
-     : undefined,
-   feishuServer: dataset.feishuServer
-     ? {
-         appId: dataset.feishuServer.appId,
-         appSecret: '',
-         folderToken: dataset.feishuServer.folderToken
-       }
-     : undefined,
    permission,
    vectorModel: getEmbeddingModel(dataset.vectorModel),
    agentModel: getLLMModel(dataset.agentModel),
-   vlmModel: getVlmModel(dataset.vlmModel)
+   vlmModel: getVlmModel(dataset.vlmModel),
+   apiDatasetServer: filterApiDatasetServerPublicData(dataset.apiDatasetServer)
  };
}

@@ -121,11 +121,9 @@ async function handler(
    type,
    sourceId,
    selector,
-   apiServer: dataset.apiServer,
-   feishuServer: dataset.feishuServer,
-   yuqueServer: dataset.yuqueServer,
    externalFileId,
-   customPdfParse
+   customPdfParse,
+   apiDatasetServer: dataset.apiDatasetServer
  });

  const chunks = rawText2Chunks({

@@ -69,9 +69,7 @@ async function handler(
    vlmModel,
    websiteConfig,
    externalReadUrl,
-   apiServer,
-   yuqueServer,
-   feishuServer,
+   apiDatasetServer,
    autoSync,
    chunkSettings
  } = req.body;
@@ -168,6 +166,37 @@ async function handler(
    await delDatasetRelevantData({ datasets: [dataset], session });
  }

+ const apiDatasetParams = (() => {
+   if (!apiDatasetServer) return {};
+
+   const flattenObjectWithConditions = (
+     obj: any,
+     prefix = 'apiDatasetServer'
+   ): Record<string, any> => {
+     const result: Record<string, any> = {};
+
+     if (!obj || typeof obj !== 'object') return result;
+
+     Object.keys(obj).forEach((key) => {
+       const value = obj[key];
+       const newKey = prefix ? `${prefix}.${key}` : key;
+
+       if (value) {
+         if (typeof value === 'object' && !Array.isArray(value)) {
+           // Recursively flatten nested objects
+           Object.assign(result, flattenObjectWithConditions(value, newKey));
+         } else {
+           // Add non-empty primitive values
+           result[newKey] = value;
+         }
+       }
+     });
+
+     return result;
+   };
+   return flattenObjectWithConditions(apiDatasetServer);
+ })();

  await MongoDataset.findByIdAndUpdate(
    id,
    {
@@ -180,23 +209,9 @@ async function handler(
      ...(chunkSettings && { chunkSettings }),
      ...(intro !== undefined && { intro }),
      ...(externalReadUrl !== undefined && { externalReadUrl }),
-     ...(!!apiServer?.baseUrl && { 'apiServer.baseUrl': apiServer.baseUrl }),
-     ...(!!apiServer?.authorization && {
-       'apiServer.authorization': apiServer.authorization
-     }),
-     ...(!!apiServer?.basePath !== undefined && { 'apiServer.basePath': apiServer?.basePath }),
-     ...(!!yuqueServer?.userId && { 'yuqueServer.userId': yuqueServer.userId }),
-     ...(!!yuqueServer?.token && { 'yuqueServer.token': yuqueServer.token }),
-     ...(!!yuqueServer?.basePath !== undefined && {
-       'yuqueServer.basePath': yuqueServer?.basePath
-     }),
-     ...(!!feishuServer?.appId && { 'feishuServer.appId': feishuServer.appId }),
-     ...(!!feishuServer?.appSecret && { 'feishuServer.appSecret': feishuServer.appSecret }),
-     ...(!!feishuServer?.folderToken && {
-       'feishuServer.folderToken': feishuServer.folderToken
-     }),
      ...(isMove && { inheritPermission: true }),
-     ...(typeof autoSync === 'boolean' && { autoSync })
+     ...(typeof autoSync === 'boolean' && { autoSync }),
+     ...apiDatasetParams
    },
    { session }
  );

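The flattening step in the update handler turns the nested `apiDatasetServer` object into Mongo dot-path update keys, skipping falsy values so an empty token never overwrites a stored secret. A minimal standalone sketch of the same idea (the helper name here is illustrative):

```typescript
// Hypothetical re-implementation of the flattening logic shown in the
// diff above, for illustration only.
export const flattenWithPrefix = (
  obj: Record<string, any>,
  prefix = 'apiDatasetServer'
): Record<string, any> => {
  const result: Record<string, any> = {};
  for (const [key, value] of Object.entries(obj ?? {})) {
    const dotKey = `${prefix}.${key}`;
    if (!value) continue; // falsy values (e.g. '' secrets) are dropped
    if (typeof value === 'object' && !Array.isArray(value)) {
      // Recurse into nested server configs, extending the dot path
      Object.assign(result, flattenWithPrefix(value, dotKey));
    } else {
      result[dotKey] = value;
    }
  }
  return result;
};
```

`flattenWithPrefix({ yuqueServer: { userId: 'u1', token: '' } })` yields `{ 'apiDatasetServer.yuqueServer.userId': 'u1' }`: the blank token produces no update key at all, so the existing stored token survives the `$set`.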
@@ -69,7 +69,7 @@ import type {
  getTrainingErrorBody,
  getTrainingErrorResponse
} from '@/pages/api/core/dataset/training/getTrainingError';
- import type { APIFileItem } from '@fastgpt/global/core/dataset/apiDataset';
+ import type { APIFileItem } from '@fastgpt/global/core/dataset/apiDataset/type';
import type { GetQuoteDataProps } from '@/pages/api/core/dataset/data/getQuoteData';
import type {
  GetApiDatasetCataLogResponse,

@@ -2,8 +2,7 @@ import { defaultQAModels, defaultVectorModels } from '@fastgpt/global/core/ai/mo
import {
  DatasetCollectionDataProcessModeEnum,
  DatasetCollectionTypeEnum,
- DatasetTypeEnum,
- TrainingModeEnum
+ DatasetTypeEnum
} from '@fastgpt/global/core/dataset/constants';
import type {
  DatasetCollectionItemType,
@@ -66,16 +65,6 @@ export const defaultCollectionDetail: DatasetCollectionItemType = {
  indexAmount: 0
};

- export const datasetTypeCourseMap: Record<`${DatasetTypeEnum}`, string> = {
-   [DatasetTypeEnum.folder]: '',
-   [DatasetTypeEnum.dataset]: '',
-   [DatasetTypeEnum.apiDataset]: '/docs/guide/knowledge_base/api_dataset/',
-   [DatasetTypeEnum.websiteDataset]: '/docs/guide/knowledge_base/websync/',
-   [DatasetTypeEnum.feishu]: '/docs/guide/knowledge_base/lark_dataset/',
-   [DatasetTypeEnum.yuque]: '/docs/guide/knowledge_base/yuque_dataset/',
-   [DatasetTypeEnum.externalFile]: ''
- };

export const TrainingProcess = {
  waiting: {
    label: i18nT('dataset:process.Waiting'),

@@ -18,6 +18,7 @@ import { useSystemStore } from '@/web/common/system/useSystemStore';
import { type ParentTreePathItemType } from '@fastgpt/global/common/parentFolder/type';
import { useRequest2 } from '@fastgpt/web/hooks/useRequest';
import { getWebLLMModel } from '@/web/common/system/utils';
+ import { filterApiDatasetServerPublicData } from '@fastgpt/global/core/dataset/apiDataset/utils';

type DatasetPageContextType = {
  datasetId: string;
@@ -103,27 +104,7 @@ export const DatasetPageContextProvider = ({
    ...data,
    agentModel: data.agentModel ? getWebLLMModel(data.agentModel) : state.agentModel,
    vlmModel: data.vlmModel ? getWebLLMModel(data.vlmModel) : state.vlmModel,
-   apiServer: data.apiServer
-     ? {
-         baseUrl: data.apiServer.baseUrl,
-         authorization: '',
-         basePath: data.apiServer.basePath
-       }
-     : state.apiServer,
-   yuqueServer: data.yuqueServer
-     ? {
-         userId: data.yuqueServer.userId,
-         token: '',
-         basePath: data.yuqueServer.basePath
-       }
-     : state.yuqueServer,
-   feishuServer: data.feishuServer
-     ? {
-         appId: data.feishuServer.appId,
-         appSecret: '',
-         folderToken: data.feishuServer.folderToken
-       }
-     : state.feishuServer
+   apiDatasetServer: filterApiDatasetServerPublicData(data.apiDatasetServer)
  }));
}
};

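`filterApiDatasetServerPublicData` replaces the three inline credential-blanking branches removed above. Its implementation is not shown in this diff; inferred from the code it replaces, it behaves roughly like this sketch (type shape and function name are assumptions):

```typescript
// Assumed shape, based on the removed inline logic above.
type ApiDatasetServerShape = {
  apiServer?: { baseUrl: string; authorization: string; basePath?: string };
  yuqueServer?: { userId: string; token: string; basePath?: string };
  feishuServer?: { appId: string; appSecret: string; folderToken: string };
};

// Blank every credential before the object reaches the client,
// keeping only the public connection details.
export const filterPublicData = (server?: ApiDatasetServerShape) =>
  server
    ? {
        ...(server.apiServer && {
          apiServer: { ...server.apiServer, authorization: '' }
        }),
        ...(server.yuqueServer && {
          yuqueServer: { ...server.yuqueServer, token: '' }
        }),
        ...(server.feishuServer && {
          feishuServer: { ...server.feishuServer, appSecret: '' }
        })
      }
    : undefined;
```

Centralizing the blanking in one shared utility means the detail API and the page context can no longer drift apart on which fields count as secret.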
projects/app/src/web/core/dataset/type.d.ts (vendored, 2 changes)
@@ -2,7 +2,7 @@ import type { PushDatasetDataChunkProps } from '@fastgpt/global/core/dataset/api
import type { TrainingModeEnum } from '@fastgpt/global/core/dataset/constants';
import type { ChunkSettingModeEnum } from '@fastgpt/global/core/dataset/constants';
import type { UseFormReturn } from 'react-hook-form';
- import type { APIFileItem } from '@fastgpt/global/core/dataset/apiDataset';
+ import type { APIFileItem } from '@fastgpt/global/core/dataset/apiDataset/type';

export type ImportSourceItemType = {
  id: string;