fix: correct the success-count calculation for intermediary imports

Problem:
- The number of successfully imported records could display as a negative value.
- Cause: the success count was computed as validRecords.size() - failures.size() instead of using the actual return values of the database operations.

Fix:
- saveBatchWithUpsert and saveBatch now return int.
- Accumulate the actual number of rows affected by the database.
- Track the real success count in an actualSuccessCount variable.

Affected scope:
- CcdiIntermediaryPersonImportServiceImpl
- CcdiIntermediaryEntityImportServiceImpl
doc/intermediary-import-failure-view-design.md (new file, 489 lines)
@@ -0,0 +1,489 @@
# Intermediary Library: Import Failure Record Viewer — Design

## 1. Background

After a failed import, the intermediary library currently only shows a notification; there is no entry point for viewing the failed records, so users cannot tell which rows failed or why.

## 2. Feature Description

Add an **import failure record viewer** to the intermediary library management page, covering both personal and entity intermediary imports.

### 2.1 Core Features

1. **Two independently managed buttons**
   - "View personal import failures" button: shown only when the personal intermediary import produced failures.
   - "View entity import failures" button: shown only when the entity intermediary import produced failures.
   - Each button carries a tooltip showing the time of the last import.

2. **localStorage persistence**
   - Personal and entity import task info is stored separately.
   - Retention: 7 days; expired entries are cleared automatically.
   - Stored fields: task ID, import time, success count, failure count, and a hasFailures flag.

3. **Failure record dialog**
   - Shows an import summary (total / succeeded / failed).
   - Lists all failed records in a paginated table (10 per page).
   - Provides a "clear history" button.
   - When the record has expired, prompts the user and clears it automatically.
## 3. Technical Design

### 3.1 Component Structure

```
index.vue (intermediary library management page)
├── Toolbar button area
│   ├── Add button
│   ├── Import button
│   ├── "View personal import failures" button (conditional)
│   └── "View entity import failures" button (conditional)
├── Data table
├── Personal intermediary import failure dialog
└── Entity intermediary import failure dialog
```

### 3.2 Data Flow

```
User selects a file and uploads it
        ↓
ImportDialog component submits the import
        ↓
Backend returns a taskId (asynchronous processing)
        ↓
Frontend starts polling the import status
        ↓
Import finishes; ImportDialog emits @import-complete
        ↓
index.vue receives the event and branches on importType
        ↓
Task info is saved to localStorage (person or entity)
        ↓
The matching failure button's visibility is updated
        ↓
User clicks the "view failures" button
        ↓
Backend API is called for the failure list (paginated)
        ↓
Failures and their error reasons are shown in the dialog
```
### 3.3 localStorage Design

#### 3.3.1 Personal Intermediary Import Task

**Key**: `intermediary_person_import_last_task`

**Data structure**:
```javascript
{
  taskId: "uuid",        // task ID
  saveTime: 1234567890,  // timestamp when saved
  hasFailures: true,     // whether any records failed
  totalCount: 100,       // total records
  successCount: 95,      // succeeded
  failureCount: 5        // failed
}
```

#### 3.3.2 Entity Intermediary Import Task

**Key**: `intermediary_entity_import_last_task`

**Data structure**: same as the personal intermediary task.
### 3.4 Page State

```javascript
data() {
  return {
    // button visibility
    showPersonFailureButton: false,
    showEntityFailureButton: false,

    // current task IDs
    currentPersonTaskId: null,
    currentEntityTaskId: null,

    // personal failure dialog
    personFailureDialogVisible: false,
    personFailureList: [],
    personFailureLoading: false,
    personFailureTotal: 0,
    personFailureQueryParams: {
      pageNum: 1,
      pageSize: 10
    },

    // entity failure dialog
    entityFailureDialogVisible: false,
    entityFailureList: [],
    entityFailureLoading: false,
    entityFailureTotal: 0,
    entityFailureQueryParams: {
      pageNum: 1,
      pageSize: 10
    }
  }
}
```
## 4. API Dependencies

### 4.1 Existing Backend Endpoints

#### 4.1.1 Query Personal Import Failures

**Endpoint**: `GET /ccdi/intermediary/importPersonFailures/{taskId}`

**Parameters**:
- `taskId`: task ID (path parameter)
- `pageNum`: page number (default 1)
- `pageSize`: page size (default 10)

**Returns**: `IntermediaryPersonImportFailureVO[]`

**Fields**:
- `name`: name
- `personId`: ID number
- `personType`: person type
- `gender`: gender
- `mobile`: mobile number
- `company`: employer
- `errorMessage`: error message

#### 4.1.2 Query Entity Import Failures

**Endpoint**: `GET /ccdi/intermediary/importEntityFailures/{taskId}`

**Parameters**:
- `taskId`: task ID (path parameter)
- `pageNum`: page number (default 1)
- `pageSize`: page size (default 10)

**Returns**: `IntermediaryEntityImportFailureVO[]`

**Fields**:
- `enterpriseName`: organization name
- `socialCreditCode`: unified social credit code
- `enterpriseType`: entity type
- `enterpriseNature`: enterprise nature
- `legalRepresentative`: legal representative
- `establishDate`: date of establishment
- `errorMessage`: error message

### 4.2 Frontend API Methods

Already available in `@/api/ccdiIntermediary.js`:
- `getPersonImportFailures(taskId, pageNum, pageSize)`: query personal import failures
- `getEntityImportFailures(taskId, pageNum, pageSize)`: query entity import failures
## 5. UI Design

### 5.1 Toolbar Buttons

```vue
<el-col :span="1.5" v-if="showPersonFailureButton">
  <el-tooltip :content="getPersonImportTooltip()" placement="top">
    <el-button
      type="warning"
      plain
      icon="el-icon-warning"
      size="mini"
      @click="viewPersonImportFailures"
    >查看个人导入失败记录</el-button>
  </el-tooltip>
</el-col>

<el-col :span="1.5" v-if="showEntityFailureButton">
  <el-tooltip :content="getEntityImportTooltip()" placement="top">
    <el-button
      type="warning"
      plain
      icon="el-icon-warning"
      size="mini"
      @click="viewEntityImportFailures"
    >查看实体导入失败记录</el-button>
  </el-tooltip>
</el-col>
```
### 5.2 Failure Record Dialogs

**Personal intermediary failure dialog**:
- Title: "个人中介导入失败记录" (personal intermediary import failures)
- Top banner: import statistics
- Table columns: name, ID number, person type, gender, mobile number, employer, **failure reason** (min-width 200px, tooltip on overflow)
- Pagination component
- Footer buttons: "Close", "Clear history"

**Entity intermediary failure dialog**:
- Title: "实体中介导入失败记录" (entity intermediary import failures)
- Top banner: import statistics
- Table columns: organization name, unified social credit code, entity type, enterprise nature, legal representative, date of establishment, **failure reason** (min-width 200px, tooltip on overflow)
- Pagination component
- Footer buttons: "Close", "Clear history"
## 6. Core Method Design

### 6.1 localStorage Management

#### 6.1.1 Personal Intermediary Import Task

```javascript
/** Save the personal import task to localStorage */
savePersonImportTaskToStorage(taskData) {
  const data = {
    ...taskData,
    saveTime: Date.now()
  }
  localStorage.setItem('intermediary_person_import_last_task', JSON.stringify(data))
}

/** Read the personal import task from localStorage */
getPersonImportTaskFromStorage() {
  try {
    const data = localStorage.getItem('intermediary_person_import_last_task')
    if (!data) return null

    const task = JSON.parse(data)

    // 7-day expiry check
    const sevenDays = 7 * 24 * 60 * 60 * 1000
    if (Date.now() - task.saveTime > sevenDays) {
      this.clearPersonImportTaskFromStorage()
      return null
    }

    return task
  } catch (error) {
    console.error('Failed to read the personal import task:', error)
    this.clearPersonImportTaskFromStorage()
    return null
  }
}

/** Clear the personal import task */
clearPersonImportTaskFromStorage() {
  localStorage.removeItem('intermediary_person_import_last_task')
}
```
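The 7-day expiry rule above can be factored into a pure, framework-free helper that is easy to unit-test; `isTaskExpired` is a hypothetical name introduced here, not part of the design:

```javascript
// Hypothetical pure helper (a sketch, not part of the design): decides
// whether a stored import-task record is still valid.
const SEVEN_DAYS_MS = 7 * 24 * 60 * 60 * 1000;

function isTaskExpired(task, now) {
  // Records without a numeric saveTime are treated as expired,
  // matching the catch-and-clear behavior in the methods above.
  if (!task || typeof task.saveTime !== 'number') return true;
  return now - task.saveTime > SEVEN_DAYS_MS;
}
```

Keeping the time check free of `Date.now()` and `localStorage` lets tests pass an explicit clock value.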
#### 6.1.2 Entity Intermediary Import Task

Same structure as the personal intermediary methods, named:
- `saveEntityImportTaskToStorage(taskData)`
- `getEntityImportTaskFromStorage()`
- `clearEntityImportTaskFromStorage()`
### 6.2 Import-Complete Handling

```javascript
/** Handle import completion */
handleImportComplete(importData) {
  const { taskId, hasFailures, importType, totalCount, successCount, failureCount } = importData

  if (importType === 'person') {
    // save the personal import task
    this.savePersonImportTaskToStorage({
      taskId,
      hasFailures,
      totalCount,
      successCount,
      failureCount
    })

    // update button visibility
    this.showPersonFailureButton = hasFailures
    this.currentPersonTaskId = taskId

  } else if (importType === 'entity') {
    // save the entity import task
    this.saveEntityImportTaskToStorage({
      taskId,
      hasFailures,
      totalCount,
      successCount,
      failureCount
    })

    // update button visibility
    this.showEntityFailureButton = hasFailures
    this.currentEntityTaskId = taskId
  }

  // refresh the list
  this.getList()
}
```
### 6.3 Viewing Failure Records

```javascript
/** View personal import failures */
viewPersonImportFailures() {
  this.personFailureDialogVisible = true
  this.getPersonFailureList()
}

/** Query the personal failure list */
getPersonFailureList() {
  this.personFailureLoading = true
  getPersonImportFailures(
    this.currentPersonTaskId,
    this.personFailureQueryParams.pageNum,
    this.personFailureQueryParams.pageSize
  ).then(response => {
    this.personFailureList = response.rows
    this.personFailureTotal = response.total
    this.personFailureLoading = false
  }).catch(error => {
    this.personFailureLoading = false
    // error handling: 404 means the record has expired
    if (error.response?.status === 404) {
      this.$modal.msgWarning('导入记录已过期,无法查看失败记录')
      this.clearPersonImportTaskFromStorage()
      this.showPersonFailureButton = false
      this.personFailureDialogVisible = false
    } else {
      this.$modal.msgError('查询失败记录失败')
    }
  })
}
```
### 6.4 Clearing History

```javascript
/** Clear the personal import history */
clearPersonImportHistory() {
  this.$confirm('确认清除上次导入记录?', '提示', {
    confirmButtonText: '确定',
    cancelButtonText: '取消',
    type: 'warning'
  }).then(() => {
    this.clearPersonImportTaskFromStorage()
    this.showPersonFailureButton = false
    this.currentPersonTaskId = null
    this.personFailureDialogVisible = false
    this.$message.success('已清除')
  }).catch(() => {})
}
```
## 7. Lifecycle Management

### 7.1 created Hook

```javascript
created() {
  this.getList()
  this.loadEnumOptions()
  this.restoreImportState() // restore import state
}
```

### 7.2 Restoring Import State

```javascript
/** Restore import state */
restoreImportState() {
  // restore the personal intermediary import state
  const personTask = this.getPersonImportTaskFromStorage()
  if (personTask && personTask.hasFailures && personTask.taskId) {
    this.currentPersonTaskId = personTask.taskId
    this.showPersonFailureButton = true
  }

  // restore the entity intermediary import state
  const entityTask = this.getEntityImportTaskFromStorage()
  if (entityTask && entityTask.hasFailures && entityTask.taskId) {
    this.currentEntityTaskId = entityTask.taskId
    this.showEntityFailureButton = true
  }
}
```
## 8. Edge Cases

### 8.1 Record Expiry

- Records older than 7 days in localStorage are cleared automatically.
- When the backend returns 404, the user is told the import record has expired and the local storage entry is cleared.
- After clearing, the corresponding "view failures" button is hidden.

### 8.2 Concurrent Imports

- Before each new import starts, the old import record is cleared.
- When an import of the same type is already in progress, the previous polling is cancelled.
- Only the most recent import task's info is kept.

### 8.3 Network Errors

- A friendly error message is shown when querying failure records fails.
- Other page features keep working normally.
## 9. Test Points

### 9.1 Functional Tests

1. **Personal intermediary import failure**
   - Import an Excel file containing invalid rows.
   - Verify the failure button appears.
   - Click it to view the failures.
   - Verify the failure reasons are displayed correctly.

2. **Entity intermediary import failure**
   - Import an Excel file containing invalid rows.
   - Verify the failure button appears.
   - Click it to view the failures.
   - Verify the failure reasons are displayed correctly.

3. **localStorage persistence**
   - Refresh the page after a failed import.
   - Verify the "view failures" button is still shown.
   - Verify the failures can still be viewed after clicking it.

4. **Pagination**
   - Produce more than 10 failed records.
   - Verify the pagination component works.
   - Verify the data after a page change is correct.

5. **Clearing history**
   - Click "clear history".
   - Verify localStorage is cleared.
   - Verify the button is hidden.
   - Import again and verify a new record works normally.

6. **Expiry handling**
   - Manually edit saveTime in localStorage to simulate expiry.
   - Refresh and verify the button is hidden,
   - or click "view" and verify the "record expired" prompt appears.

### 9.2 Compatibility Tests

1. **Browsers**
   - Chrome
   - Firefox
   - Edge
   - Safari

2. **Performance with large data sets**
   - Import 1000 rows with 100 failures.
   - Verify query speed and rendering performance.
## 10. Reference Implementation

This design follows the import failure viewer in the employee management page (`ccdiEmployee/index.vue`), mainly:

1. The localStorage storage pattern
2. The failure dialog layout
3. The paginated query logic
4. The error handling
5. The expired-record cleanup

## 11. Change History

| Date | Version | Change | Author |
|------|---------|--------|--------|
| 2026-02-08 | 1.0 | Initial design | Claude |
@@ -0,0 +1,468 @@

# Intermediary Import Optimization — Design Document

## Overview

This document describes how to optimize the intermediary import feature with MySQL's `INSERT ... ON DUPLICATE KEY UPDATE`, replacing the current "delete, then re-insert" update pattern to improve performance and simplify the code.

**Design date**: 2026-02-08
**Goal**: optimize batch import performance for personal and entity intermediaries
**Core change**: implement upsert via `ON DUPLICATE KEY UPDATE`

---

## I. Overall Architecture

### 1.1 Core Changes

**The existing architecture is kept:**
- Controller layer: `CcdiIntermediaryController` — unchanged
- Service layer: `CcdiIntermediaryServiceImpl` — logic simplified
- Mapper layer: new batch import methods

**Optimization points:**

| Layer | Current approach | Optimized approach | Improvement |
|-------|------------------|--------------------|-------------|
| Mapper | `insertBatch` + `delete` | `importBatch` (ON DUPLICATE KEY UPDATE) | insert-or-update in a single SQL statement |
| Service | query → classify → delete → insert | validate → import directly | ~50% less code |
| Database | 2–3 operations | 1 operation | 30–40% faster response |
### 1.2 Data Flow Changes

**Before:**
```
Parse Excel → validate → batch-query existing records → classify rows
→ batch-delete existing records → batch-insert new and updated rows
```

**After:**
```
Parse Excel → validate → batch INSERT ... ON DUPLICATE KEY UPDATE
```

**Key simplifications:**
- The "batch-query existing records" step is removed.
- The "classify into new/updated rows" step is removed.
- The "batch-delete existing records" step is removed.
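As a language-neutral illustration of the upsert semantics above (shown in JavaScript for brevity; the real implementation is the MySQL statement in the next section), the "non-empty incoming values win, otherwise keep the stored value" rule can be modeled against an in-memory table:

```javascript
// Conceptual sketch only: a Map keyed by the unique field, mirroring
// INSERT ... ON DUPLICATE KEY UPDATE with the non-empty-field rule.
function upsert(table, key, row) {
  const existing = table.get(row[key]);
  if (!existing) {
    table.set(row[key], { ...row }); // no conflict: plain insert
    return 'inserted';
  }
  for (const [field, value] of Object.entries(row)) {
    if (value !== null && value !== undefined && value !== '') {
      existing[field] = value; // only non-empty incoming values overwrite
    }
  }
  return 'updated';
}
```

The point of the sketch is that a single call site handles both paths, which is exactly what lets the service layer drop its query/classify/delete steps.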
---
## II. SQL Implementation Details

### 2.1 Personal Intermediary Batch Import SQL

**Mapper method signature:**
```java
void importPersonBatch(@Param("list") List<CcdiBizIntermediary> list);
```

**SQL (CcdiBizIntermediaryMapper.xml):**
```xml
<insert id="importPersonBatch">
    INSERT INTO cdi_biz_intermediary (
        person_id, name, gender, phone, address,
        intermediary_type, data_source, created_by, updated_by
    ) VALUES
    <foreach collection="list" item="item" separator=",">
        (
            #{item.personId}, #{item.name}, #{item.gender},
            #{item.phone}, #{item.address}, #{item.intermediaryType},
            #{item.dataSource}, #{item.createdBy}, #{item.updatedBy}
        )
    </foreach>
    ON DUPLICATE KEY UPDATE
    <!-- VALUES(col) refers to the value the conflicting row would have
         inserted; #{item.*} cannot be used here because the foreach
         variable is out of scope after the loop ends -->
    name = IF(VALUES(name) IS NOT NULL AND VALUES(name) != '', VALUES(name), name),
    gender = IF(VALUES(gender) IS NOT NULL AND VALUES(gender) != '', VALUES(gender), gender),
    phone = IF(VALUES(phone) IS NOT NULL AND VALUES(phone) != '', VALUES(phone), phone),
    address = IF(VALUES(address) IS NOT NULL AND VALUES(address) != '', VALUES(address), address),
    intermediary_type = IF(VALUES(intermediary_type) IS NOT NULL AND VALUES(intermediary_type) != '', VALUES(intermediary_type), intermediary_type),
    update_time = NOW(),
    updated_by = VALUES(updated_by)
</insert>
```
### 2.2 Entity Intermediary Batch Import SQL

**Mapper method signature:**
```java
void importEntityBatch(@Param("list") List<CcdiEnterpriseBaseInfo> list);
```

**SQL (CcdiEnterpriseBaseInfoMapper.xml):**
```xml
<insert id="importEntityBatch">
    INSERT INTO cdi_enterprise_base_info (
        social_credit_code, enterprise_name, legal_representative,
        phone, address, risk_level, ent_source, data_source,
        created_by, updated_by
    ) VALUES
    <foreach collection="list" item="item" separator=",">
        (
            #{item.socialCreditCode}, #{item.enterpriseName},
            #{item.legalRepresentative}, #{item.phone}, #{item.address},
            #{item.riskLevel}, #{item.entSource}, #{item.dataSource},
            #{item.createdBy}, #{item.updatedBy}
        )
    </foreach>
    ON DUPLICATE KEY UPDATE
    <!-- VALUES(col) is used because #{item.*} is out of scope after the foreach -->
    enterprise_name = IF(VALUES(enterprise_name) IS NOT NULL AND VALUES(enterprise_name) != '', VALUES(enterprise_name), enterprise_name),
    legal_representative = IF(VALUES(legal_representative) IS NOT NULL AND VALUES(legal_representative) != '', VALUES(legal_representative), legal_representative),
    phone = IF(VALUES(phone) IS NOT NULL AND VALUES(phone) != '', VALUES(phone), phone),
    address = IF(VALUES(address) IS NOT NULL AND VALUES(address) != '', VALUES(address), address),
    update_time = NOW(),
    updated_by = VALUES(updated_by)
</insert>
```
### 2.3 Key Design Points

**1. Non-empty-field update strategy:**
```sql
field = IF(VALUES(field) IS NOT NULL AND VALUES(field) != '', VALUES(field), field)
```
- Only fields that are non-empty in the Excel row are updated.
- Existing database values are preserved otherwise.
- This avoids accidentally blanking out data.

**2. Audit fields:**

| Field | On INSERT | On UPDATE |
|-------|-----------|-----------|
| created_by | set to the current user | not updated |
| create_time | database default NOW() | not updated |
| updated_by | NULL | set to the current user |
| update_time | database default NOW() | updated to NOW() |

**3. Unique key constraints:**
- Personal intermediary: `person_id` (ID number)
- Entity intermediary: `social_credit_code` (unified social credit code)

**4. Batch size:**
- At most 500 rows per batch.
- Avoids performance problems caused by overly long SQL statements.
- Inputs with more than 500 rows are processed in multiple batches.
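The 500-per-batch split can be sketched as a small chunking helper (JavaScript for brevity; the actual service layer is Java, where a utility like Guava's `Lists.partition` plays the same role):

```javascript
// Split a list into batches of at most `size` rows, preserving order.
// The last batch may be smaller than `size`.
function chunk(list, size) {
  const batches = [];
  for (let i = 0; i < list.length; i += size) {
    batches.push(list.slice(i, i + size));
  }
  return batches;
}
```

Each batch is then passed to one `importPersonBatch`/`importEntityBatch` call, keeping every generated SQL statement bounded in length.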
---
## III. Service Layer

### 3.1 Handling the isUpdateSupport Parameter

**Option C — preprocessing in the service layer** is adopted:

```java
@Override
@Async
@Transactional(rollbackFor = Exception.class)
public void importPersonAsync(List<CcdiIntermediaryPersonExcel> excelList,
                              Boolean isUpdateSupport,
                              String taskId,
                              String userName) {
    List<CcdiBizIntermediary> validRecords = new ArrayList<>();
    List<IntermediaryPersonImportFailureVO> failures = new ArrayList<>();

    // 1. Validation phase
    for (CcdiIntermediaryPersonExcel excel : excelList) {
        try {
            validatePersonData(excel);
            CcdiBizIntermediary intermediary = new CcdiBizIntermediary();
            BeanUtils.copyProperties(excel, intermediary);
            intermediary.setDataSource("IMPORT");
            intermediary.setCreatedBy(userName);
            if (isUpdateSupport) {
                intermediary.setUpdatedBy(userName);
            }
            validRecords.add(intermediary);
        } catch (Exception e) {
            IntermediaryPersonImportFailureVO failure = new IntermediaryPersonImportFailureVO();
            BeanUtils.copyProperties(excel, failure);
            failure.setErrorMessage(e.getMessage());
            failures.add(failure);
        }
    }

    // 2. Choose the write path based on isUpdateSupport
    if (isUpdateSupport) {
        // update mode: import directly; the database decides INSERT vs UPDATE
        importBatchWithUpdateSupport(validRecords, 500);
    } else {
        // insert-only mode: look up existing records first and fail on conflicts
        Set<String> existingIds = getExistingPersonIds(validRecords);
        for (CcdiBizIntermediary record : validRecords) {
            if (existingIds.contains(record.getPersonId())) {
                throw new RuntimeException("该证件号已存在");
            }
        }
        // no conflicts: batch insert
        importBatchWithoutUpdateSupport(validRecords, 500);
    }

    // 3. Update the import status
    ImportResult result = new ImportResult();
    result.setTotalCount(excelList.size());
    result.setSuccessCount(validRecords.size());
    result.setFailureCount(failures.size());

    String finalStatus = result.getFailureCount() == 0 ? "SUCCESS" : "PARTIAL_SUCCESS";
    updateImportStatus(taskId, finalStatus, result);
}
```
### 3.2 Code Simplification

**Before (~120 lines; pseudocode, loop types elided):**
```java
// 1. Batch-query existing records
Set<String> existingIds = getExistingPersonIds(excelList);

// 2. Classify the rows
for (excel : excelList) {
    if (existingIds.contains(excel.getPersonId())) {
        if (isUpdateSupport) {
            updateRecords.add(convert(excel));
        } else {
            throw new RuntimeException("已存在");
        }
    } else {
        newRecords.add(convert(excel));
    }
}

// 3. Batch-insert new rows
if (!newRecords.isEmpty()) {
    saveBatch(newRecords, 500);
}

// 4. Batch-update existing rows (delete, then re-insert)
if (!updateRecords.isEmpty() && isUpdateSupport) {
    List<String> personIds = updateRecords.stream()
        .map(CcdiBizIntermediary::getPersonId)
        .collect(Collectors.toList());

    LambdaQueryWrapper<CcdiBizIntermediary> deleteWrapper = new LambdaQueryWrapper<>();
    deleteWrapper.in(CcdiBizIntermediary::getPersonId, personIds);
    intermediaryMapper.delete(deleteWrapper);

    intermediaryMapper.insertBatch(updateRecords);
}
```

**After (~60 lines; pseudocode):**
```java
// 1. Validate and convert
for (excel : excelList) {
    validatePersonData(excel);
    validRecords.add(convert(excel));
}

// 2. Import directly (the database decides INSERT vs UPDATE)
if (isUpdateSupport) {
    intermediaryMapper.importPersonBatch(validRecords);
} else {
    // insert-only mode: enforce uniqueness
    checkUniqueAndInsert(validRecords);
}
```
---
## IV. Error Handling and Edge Cases

### 4.1 Error Classification

| Error type | Handling | Status |
|------------|----------|--------|
| Validation error | add to the failure list and keep processing | PARTIAL_SUCCESS |
| Uniqueness conflict (isUpdateSupport=false) | throw; add to the failure list | PARTIAL_SUCCESS |
| SQL execution error | roll back the transaction; log details | FAILED |
| All records failed | status FAILED | FAILED |
| Partial success | status PARTIAL_SUCCESS | PARTIAL_SUCCESS |

### 4.2 Edge Cases

| Scenario | Handling |
|----------|----------|
| Empty Excel file | return "at least one row is required" |
| All rows malformed | success=0, failures=total, status=FAILED |
| Very large input (>5000 rows) | process in batches of 500 |
| Concurrent imports of the same data | rely on the database unique index for consistency |
| NULL field on update | skipped via the IF clause; original value kept |
| Empty-string field on update | treated like NULL; not updated |

### 4.3 Transactions

```java
@Async
@Transactional(rollbackFor = Exception.class)
public void importPersonAsync(...) {
    try {
        // validate
        // batch import
        // update status
    } catch (Exception e) {
        // the transaction rolls back automatically
        // log the error
        // set the status to FAILED
        throw e;
    }
}
```
---
## V. Testing Strategy

### 5.1 Unit Tests

**Mapper layer:**
- ✅ Batch insert of all-new records
- ✅ Batch update of existing records
- ✅ Mixed scenario (some new + some existing)
- ✅ NULL fields do not overwrite existing values
- ✅ Audit fields set and updated correctly
- ✅ Unique-key conflict handling

**Service layer:**
- ✅ Full flow with `isUpdateSupport=true`
- ✅ Duplicate data throws with `isUpdateSupport=false`
- ✅ Validation logic (required fields, format checks)
- ✅ Transaction rollback
- ✅ Failure records saved to Redis

### 5.2 Integration Test Scenarios

| Scenario | Steps | Expected result |
|----------|-------|-----------------|
| Insert mode | import 100 all-new rows | all inserted; audit fields correct |
| Update mode | import → modify → import again | data updated; NULL fields preserved |
| Mixed mode | 50 new + 50 existing rows | new rows inserted; old rows updated |
| Insert-only conflict | import existing rows (isUpdateSupport=false) | exception thrown; recorded as failures |
| Empty file | import an empty Excel | "at least one row is required" |
| All failed | every row malformed | status=FAILED; failures=total |
| Large volume | import 2000+ rows | processed in batches; all succeed |
| Concurrency | import the same data simultaneously | unique index guarantees consistency |

### 5.3 Performance Tests

**Test data:**
- 500 rows
- 1000 rows
- 2000 rows

**Metrics:**
- total response time
- number of database operations
- memory usage

**Expected gains:**
- 30–40% faster in update mode
- 2 fewer database operations (query + delete)
---
## VI. Implementation Plan

### 6.1 Steps

1. **Database preparation**
   - Confirm `cdi_biz_intermediary.person_id` has a unique index.
   - Confirm `cdi_enterprise_base_info.social_credit_code` has a unique index.

2. **Mapper layer**
   - Add `importPersonBatch` to the `CcdiBizIntermediaryMapper` interface.
   - Add `importEntityBatch` to the `CcdiEnterpriseBaseInfoMapper` interface.
   - Implement the SQL in the corresponding XML files.

3. **Service layer refactoring**
   - Modify `CcdiIntermediaryPersonImportServiceImpl.importPersonAsync`.
   - Modify `CcdiIntermediaryEntityImportServiceImpl.importEntityAsync`.
   - Simplify the logic and remove the delete step.

4. **Unit tests**
   - Write Mapper-layer tests.
   - Write Service-layer tests.

5. **Integration tests**
   - Validate with the existing test data.
   - Compare performance before and after.

6. **Documentation**
   - Update the API docs.
   - Record the performance results.

### 6.2 Backward Compatibility

- ✅ API unchanged; no frontend changes needed
- ✅ Response format unchanged
- ✅ Error handling unchanged
- ✅ Redis status management unchanged

### 6.3 Risk Assessment

| Risk | Impact | Mitigation |
|------|--------|------------|
| Missing unique index | feature fails | verify the index exists before rollout |
| Database version compatibility | SQL not supported | confirm MySQL 5.7+ |
| Concurrency conflicts | inconsistent data | rely on the unique index and transactions |
| Performance regression | slower responses | run before/after performance comparisons |
---
## VII. Expected Benefits

### 7.1 Performance

| Metric | Before | After | Improvement |
|--------|--------|-------|-------------|
| Database operations | 3 (query + delete + insert) | 1 (upsert) | −66% |
| Lines of code | ~120 | ~60 | −50% |
| Response time (1000-row update) | baseline | 30–40% lower | 30–40% |

### 7.2 Code Quality

- ✅ Clearer logic, easier to maintain
- ✅ Fewer opportunities for errors
- ✅ Better transactional consistency
- ✅ Follows database best practice

### 7.3 Maintainability

- SQL is centralized in the XML files and easy to tune.
- Simpler business logic lowers the cognitive load.
- More precise error handling.
- Fuller test coverage.

---

## VIII. Appendix

### 8.1 Related Files

- Controller: `CcdiIntermediaryController.java`
- Service interface: `ICcdiIntermediaryService.java`
- Service implementation: `CcdiIntermediaryServiceImpl.java`
- Import service: `CcdiIntermediaryPersonImportServiceImpl.java`
- Mapper interface: `CcdiBizIntermediaryMapper.java`
- Mapper XML: `CcdiBizIntermediaryMapper.xml`

### 8.2 Table Structure

**Personal intermediary table (cdi_biz_intermediary):**
```sql
UNIQUE KEY `uk_person_id` (`person_id`)
```

**Entity intermediary table (cdi_enterprise_base_info):**
```sql
PRIMARY KEY (`social_credit_code`)
```

### 8.3 Test Data

- Test file: `doc/test-data/purchase_transaction/purchase_test_data_2000_final.xlsx`
- Test script: to be generated

---

**Document version**: 1.0
**Last updated**: 2026-02-08
**Status**: pending review
doc/plans/2026-02-08-purchase-transaction-import-fixes.md (new file, 209 lines)
@@ -0,0 +1,209 @@
# Purchase Transaction Import — Fix Summary

## Fix Date
2026-02-08

## Problem Description

### Problem 1: every import fails with "采购数量不能为空" (purchase quantity must not be empty)

**Symptoms**:
- Importing `purchase_test_data_2000.xlsx` failed for all 2000 rows.
- Error message: `采购数量不能为空` (purchase quantity must not be empty)
- The failure-record query endpoint returned 2000 records.

**Root cause**:
- In the Excel entity class `CcdiPurchaseTransactionExcel`, the numeric fields (purchaseQty, budgetAmount, etc.) were declared as **String**.
- In the AddDTO `CcdiPurchaseTransactionAddDTO`, the corresponding fields are **BigDecimal**.
- `BeanUtils.copyProperties()` only copies properties whose types match; with a String source and a BigDecimal target, the values are skipped and the target fields stay null.
- Combined with blank numeric cells in the test data, validation failed for every row.

**Fix**:
Change the numeric field types in `CcdiPurchaseTransactionExcel.java` from String to BigDecimal.

**Modified file**:
- `ruoyi-ccdi/src/main/java/com/ruoyi/ccdi/domain/excel/CcdiPurchaseTransactionExcel.java:52-82`

**Change**:
```java
// before
private String purchaseQty;
private String budgetAmount;
private String bidAmount;
// ... other amount fields

// after
private BigDecimal purchaseQty;
private BigDecimal budgetAmount;
private BigDecimal bidAmount;
// ... other amount fields
```
---
### Problem 2: the failure-record dialog shows "暂无数据" (no data)

**Symptoms**:
- After a failed import, the user clicks "view import failures".
- The backend endpoint returns the failure records,
- but the frontend displays "no data".

**Root cause**:
- The frontend expects a paginated payload: `{rows: [...], total: N}`.
- The backend returned a plain list: `{data: [...]}`.
- The endpoint had no pagination parameters or logic.

**Fix**:
Following the employee management module, change the failure-record query endpoint of purchase transaction management:
1. Add pagination parameters (pageNum, pageSize).
2. Implement manual pagination.
3. Change the return type from `AjaxResult` to `TableDataInfo`.
4. Use `getDataTable()` to return the paginated shape.

**Modified file**:
- `ruoyi-ccdi/src/main/java/com/ruoyi/ccdi/controller/CcdiPurchaseTransactionController.java:173-196`

**Change**:
```java
// before
@GetMapping("/importFailures/{taskId}")
public AjaxResult getImportFailures(@PathVariable String taskId) {
    List<PurchaseTransactionImportFailureVO> failures = transactionImportService.getImportFailures(taskId);
    return success(failures); // returns {data: [...]}
}

// after
@GetMapping("/importFailures/{taskId}")
public TableDataInfo getImportFailures(
        @PathVariable String taskId,
        @RequestParam(defaultValue = "1") Integer pageNum,
        @RequestParam(defaultValue = "10") Integer pageSize) {

    List<PurchaseTransactionImportFailureVO> failures = transactionImportService.getImportFailures(taskId);

    // manual pagination; fromIndex is clamped so an out-of-range page
    // returns an empty list instead of subList throwing
    int fromIndex = Math.min((pageNum - 1) * pageSize, failures.size());
    int toIndex = Math.min(fromIndex + pageSize, failures.size());
    List<PurchaseTransactionImportFailureVO> pageData = failures.subList(fromIndex, toIndex);

    return getDataTable(pageData, failures.size()); // returns {rows: [...], total: N}
}
```
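The manual paging math above generalizes to any in-memory list; a sketch (JavaScript for brevity) including the clamp that keeps out-of-range pages from throwing:

```javascript
// Return one page of `list` plus the total count, mirroring the
// {rows, total} shape the frontend expects.
function page(list, pageNum, pageSize) {
  // clamp fromIndex so a pageNum past the end yields an empty page
  const fromIndex = Math.min((pageNum - 1) * pageSize, list.length);
  const toIndex = Math.min(fromIndex + pageSize, list.length);
  return { rows: list.slice(fromIndex, toIndex), total: list.length };
}
```

Without the clamp, Java's `subList(fromIndex, toIndex)` would throw an `IndexOutOfBoundsException` when `fromIndex` exceeds the list size; `Array.prototype.slice` tolerates it, but the clamp keeps both versions equivalent.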
---
## Full Flow After the Fixes

### 1. Normal import (clean data)
1. Upload the Excel file.
2. The backend processes it asynchronously and validates the data.
3. All rows pass validation and are inserted into the database.
4. The frontend is notified: "导入完成!全部成功!共导入2000条数据" (import complete, all 2000 rows succeeded).
5. The list refreshes and shows the newly imported data.

### 2. Partial failure (invalid data)
1. Upload the Excel file.
2. The backend processes it asynchronously and validates the data.
3. Some rows fail validation; the failures are saved to Redis.
4. The frontend is notified: "导入完成!成功1800条,失败200条" (import complete: 1800 succeeded, 200 failed).
5. The toolbar shows the "view import failures" button.
6. Clicking it opens the failure dialog.
7. The dialog lists the paginated failures:
   - purchase item ID
   - project name
   - subject-matter name
   - failure reason
8. Paginated queries are supported (10 per page).
9. History can be cleared.
---
## Test Suggestions

### 1. Normal import
- Use the corrected test data: `purchase_test_data_2000_fixed.xlsx`
- Expected: all 2000 rows import successfully.

### 2. Failure-record viewing
- Use deliberately broken test data.
- Expected:
  - a partial-success notification is shown,
  - the "view import failures" button appears,
  - clicking it shows the failure list,
  - pagination works.

### 3. State persistence
- Import data that contains failures.
- Refresh the page.
- Expected: the "view import failures" button is still shown.
---
## Fix Verification Checklist

- [x] Change the Excel entity field types
- [x] Backend recompiled successfully
- [x] Change the failure-record query endpoint
- [x] Add pagination support
- [x] Backend recompiled successfully
- [ ] Restart the backend service
- [ ] Test a normal import
- [ ] Test failure-record viewing
- [ ] Verify the frontend displays correctly

---

## Next Steps

**Manual actions required**:
1. Restart the backend service (to load the newly compiled code).
2. Run an import test with the corrected test data.
3. Verify the failure-record viewer works.
---
## Technical Notes

### Excel Numeric Field Handling
- **EasyExcel** converts cell values according to the Java field type.
- String fields → read as strings (an empty cell may become an empty string).
- BigDecimal fields → read as numbers (an empty cell becomes null).
- Spring's BeanUtils.copyProperties() copies only properties whose types match; it does not convert String to BigDecimal, which is why the field types on both sides must agree.

### Paginated Response Format
```javascript
// the format the frontend expects
{
  "code": 200,
  "msg": "查询成功",
  "rows": [...],  // current page of data
  "total": 100    // total record count
}
```
### RuoYi Pagination Helper
```java
// BaseController.getDataTable()
protected TableDataInfo getDataTable(List<?> list, long total) {
    TableDataInfo rspData = new TableDataInfo();
    rspData.setCode(200);
    rspData.setMsg("查询成功");
    rspData.setRows(list);
    rspData.setTotal(total);
    return rspData;
}
```

---

## Appendix: Related Files

### Modified Files
1. `ruoyi-ccdi/src/main/java/com/ruoyi/ccdi/domain/excel/CcdiPurchaseTransactionExcel.java`
2. `ruoyi-ccdi/src/main/java/com/ruoyi/ccdi/controller/CcdiPurchaseTransactionController.java`

### Reference Files
1. `ruoyi-ccdi/src/main/java/com/ruoyi/ccdi/controller/CcdiEmployeeController.java` (employee management, used as the reference)

### Test Files
1. `doc/test-data/purchase_transaction/generate-test-data.js` (test data generator)
2. `doc/test-data/purchase_transaction/purchase_test_data_2000_fixed.xlsx` (corrected test data)
3. `doc/test-data/purchase_transaction/test-import-debug.js` (import test script)
doc/test-checklist-intermediary-import-failure-view.md (new file, 324 lines)
@@ -0,0 +1,324 @@
# 中介库导入失败记录查看功能 - 测试清单

## 测试环境

- 前端: Vue 2.6.12 + Element UI
- 后端: Spring Boot 3.5.8
- 测试数据目录: `doc/test-data/purchase_transaction/`

## 测试前准备

### 1. 准备测试数据

准备包含错误数据的Excel文件,用于测试导入失败场景:

**个人中介测试数据应包含的错误类型:**

- 缺少必填字段(姓名、证件号)
- 证件号格式错误
- 手机号格式错误
- 重复数据(唯一键冲突)

**实体中介测试数据应包含的错误类型:**

- 缺少必填字段(机构名称、统一社会信用代码)
- 统一社会信用代码格式错误
- 重复数据(唯一键冲突)

### 2. 清理环境

打开浏览器开发者工具 → Application → Local Storage,清除以下key:

- `intermediary_person_import_last_task`
- `intermediary_entity_import_last_task`

## 功能测试清单

### 测试1: 个人中介导入失败记录查看

#### 步骤

1. 访问中介库管理页面
2. 点击"导入"按钮
3. 选择"个人中介"导入类型
4. 上传包含错误数据的个人中介Excel文件
5. 等待导入完成(观察通知消息)
6. 验证"查看个人导入失败记录"按钮是否显示
7. 点击按钮查看失败记录

#### 预期结果

- ✅ 导入完成后显示通知:"成功X条,失败Y条"
- ✅ 工具栏显示"查看个人导入失败记录"按钮(黄色警告样式)
- ✅ 按钮tooltip显示上次导入时间
- ✅ 点击按钮打开对话框
- ✅ 对话框标题:"个人中介导入失败记录"
- ✅ 顶部显示统计信息:"导入时间: XXX | 总数: X条 | 成功: X条 | 失败: X条"
- ✅ 表格显示失败记录,包含以下列:
  - 姓名
  - 证件号码
  - 人员类型
  - 性别
  - 手机号码
  - 所在公司
  - **失败原因**(最小宽度200px,溢出显示tooltip)
- ✅ 如果失败记录超过10条,分页组件正常显示

### 测试2: 实体中介导入失败记录查看

#### 步骤

1. 访问中介库管理页面
2. 点击"导入"按钮
3. 选择"实体中介"导入类型
4. 上传包含错误数据的实体中介Excel文件
5. 等待导入完成(观察通知消息)
6. 验证"查看实体导入失败记录"按钮是否显示
7. 点击按钮查看失败记录

#### 预期结果

- ✅ 导入完成后显示通知:"成功X条,失败Y条"
- ✅ 工具栏显示"查看实体导入失败记录"按钮(黄色警告样式)
- ✅ 按钮tooltip显示上次导入时间
- ✅ 点击按钮打开对话框
- ✅ 对话框标题:"实体中介导入失败记录"
- ✅ 顶部显示统计信息:"导入时间: XXX | 总数: X条 | 成功: X条 | 失败: X条"
- ✅ 表格显示失败记录,包含以下列:
  - 机构名称
  - 统一社会信用代码
  - 主体类型
  - 企业性质
  - 法定代表人
  - 成立日期(格式: YYYY-MM-DD)
  - **失败原因**(最小宽度200px,溢出显示tooltip)
- ✅ 如果失败记录超过10条,分页组件正常显示

### 测试3: localStorage持久化

#### 步骤

1. 执行个人中介导入,包含失败记录
2. 观察按钮显示
3. 刷新页面(F5)
4. 观察"查看个人导入失败记录"按钮是否仍然显示
5. 点击按钮验证能否正常查看失败记录

#### 预期结果

- ✅ 刷新页面后按钮仍然显示
- ✅ 点击按钮能正常查看失败记录
- ✅ localStorage中存在`intermediary_person_import_last_task`或`intermediary_entity_import_last_task`
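
上述持久化行为(7天过期、刷新后仍可见)的核心读写逻辑可以用下面的草图表示(`saveImportTask`/`loadImportTask` 为假设的示意函数名,非项目中的真实实现;为便于在 Node 中演示,storage 以参数传入,浏览器环境中即为与 `window.localStorage` 同形的对象):

```javascript
const EXPIRE_MS = 7 * 24 * 60 * 60 * 1000; // 存储期限:7天

// 保存导入任务信息(taskId、成功/失败数、hasFailures),并附加保存时间戳
function saveImportTask(storage, key, task) {
  storage.setItem(key, JSON.stringify({ ...task, saveTime: Date.now() }));
}

// 读取任务信息:不存在返回 null;超过7天视为过期,清除后返回 null
function loadImportTask(storage, key, now = Date.now()) {
  const raw = storage.getItem(key);
  if (!raw) return null;
  const data = JSON.parse(raw);
  if (now - data.saveTime > EXPIRE_MS) {
    storage.removeItem(key);
    return null;
  }
  return data;
}
```

按钮的显示条件即"`loadImportTask(...)` 非空且 `hasFailures` 为 true";连续两次导入时,后一次 `setItem` 直接覆盖前一次,对应测试12的预期。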

### 测试4: 分页功能

#### 步骤

1. 准备至少20条失败记录的数据
2. 导入并等待完成
3. 打开失败记录对话框
4. 测试翻页功能

#### 预期结果

- ✅ 分页组件显示正确的总记录数
- ✅ 每页显示10条记录
- ✅ 点击下一页/上一页按钮正常切换
- ✅ 修改每页显示数量正常工作

### 测试5: 清除历史记录

#### 步骤

1. 打开失败记录对话框
2. 点击"清除历史记录"按钮
3. 确认清除操作
4. 关闭对话框
5. 观察工具栏按钮是否隐藏
6. 检查localStorage是否已清除

#### 预期结果

- ✅ 弹出确认对话框:"确认清除上次导入记录?"
- ✅ 确认后显示成功提示:"已清除"
- ✅ 对话框关闭
- ✅ 工具栏对应的"查看失败记录"按钮隐藏
- ✅ localStorage中的对应key已删除

### 测试6: 记录过期处理

#### 方法1: 手动修改localStorage模拟过期

1. 打开开发者工具 → Application → Local Storage
2. 找到`intermediary_person_import_last_task`或`intermediary_entity_import_last_task`
3. 修改`saveTime`为8天前的时间戳
4. 刷新页面
5. 观察按钮是否隐藏
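
第3步所需的"8天前的时间戳"可以直接在浏览器控制台生成(毫秒时间戳,示意代码):

```javascript
// 生成 8 天前的毫秒时间戳,用于替换 localStorage 记录中的 saveTime 字段
const eightDaysAgo = Date.now() - 8 * 24 * 60 * 60 * 1000;
console.log(eightDaysAgo);
```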

#### 方法2: 等待后端记录过期

1. 导入数据并等待失败记录显示
2. 等待后端清理过期记录(根据后端配置的过期时间)
3. 点击"查看失败记录"按钮
4. 观察错误提示

#### 预期结果

- ✅ 方法1: 刷新后按钮自动隐藏
- ✅ 方法2: 显示提示"导入记录已过期,无法查看失败记录"
- ✅ 方法2: localStorage自动清除
- ✅ 方法2: 按钮自动隐藏

### 测试7: 两种类型导入互不影响

#### 步骤

1. 先导入个人中介(有失败记录)
2. 再导入实体中介(有失败记录)
3. 验证两个按钮是否同时显示
4. 分别点击两个按钮,验证显示的失败记录是否正确

#### 预期结果

- ✅ 两个按钮同时显示
- ✅ "查看个人导入失败记录"按钮显示个人中介的失败记录
- ✅ "查看实体导入失败记录"按钮显示实体中介的失败记录
- ✅ 两个localStorage存储独立,互不影响

### 测试8: 导入成功场景

#### 步骤

1. 准备完全正确的Excel文件(所有数据都符合要求)
2. 导入数据
3. 等待导入完成

#### 预期结果

- ✅ 显示成功通知:"全部成功!共导入X条数据"
- ✅ 不显示"查看失败记录"按钮
- ✅ localStorage中不存储该任务(或hasFailures为false)

### 测试9: 网络错误处理

#### 步骤

1. 导入数据(有失败记录)
2. 打开失败记录对话框
3. 断开网络或使用浏览器开发者工具模拟离线
4. 尝试翻页或重新加载失败记录

#### 预期结果

- ✅ 显示友好的错误提示:"网络连接失败,请检查网络"
- ✅ 不影响页面其他功能的正常使用

### 测试10: 服务器错误处理

#### 步骤

1. 导入数据(有失败记录)
2. 使用浏览器开发者工具模拟服务器错误(500)
3. 尝试加载失败记录

#### 预期结果

- ✅ 显示错误提示:"服务器错误,请稍后重试"

## 边界情况测试

### 测试11: 大数据量性能测试

#### 步骤

1. 准备1000条数据,其中100条失败
2. 导入并等待完成
3. 打开失败记录对话框
4. 测试翻页性能

#### 预期结果

- ✅ 导入在合理时间内完成(参考员工模块:1000条约1-2分钟)
- ✅ 查询失败记录响应时间 < 2秒
- ✅ 翻页流畅,无卡顿

### 测试12: 并发导入

#### 步骤

1. 快速连续执行两次个人中介导入
2. 观察localStorage中的数据
3. 观察按钮显示状态

#### 预期结果

- ✅ 只有最近一次导入的数据被保存
- ✅ 按钮显示状态基于最新的导入结果

## 浏览器兼容性测试

### 测试13: 不同浏览器测试

在以下浏览器中重复执行测试1和测试2:

- ✅ Chrome (推荐)
- ✅ Firefox
- ✅ Edge
- ✅ Safari (Mac)

## 回归测试

### 测试14: 原有功能不受影响

验证以下原有功能仍正常工作:

- ✅ 新增中介(个人/实体)
- ✅ 编辑中介(个人/实体)
- ✅ 查看详情
- ✅ 删除中介
- ✅ 搜索功能
- ✅ 导入成功场景
- ✅ 导入模板下载

## 性能测试

### 测试15: 内存泄漏检查

1. 打开浏览器开发者工具 → Performance
2. 开始录制
3. 执行多次导入和查看失败记录操作
4. 停止录制
5. 检查内存使用情况

#### 预期结果

- ✅ 内存使用稳定,无明显泄漏
- ✅ 定时器在组件销毁时正确清理

## 自动化测试脚本(可选)

### 测试16: API接口测试

使用Postman或curl测试以下接口:

```bash
# 1. 测试个人中介导入失败记录查询
curl -X GET "http://localhost:8080/ccdi/intermediary/importPersonFailures/{taskId}?pageNum=1&pageSize=10" \
  -H "Authorization: Bearer {token}"

# 2. 测试实体中介导入失败记录查询
curl -X GET "http://localhost:8080/ccdi/intermediary/importEntityFailures/{taskId}?pageNum=1&pageSize=10" \
  -H "Authorization: Bearer {token}"

# 3. 测试过期记录查询(应返回404)
curl -X GET "http://localhost:8080/ccdi/intermediary/importPersonFailures/expired-task-id?pageNum=1&pageSize=10" \
  -H "Authorization: Bearer {token}"
```
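
手工执行上述 curl 后,可以用一个简单的 JavaScript 校验函数检查返回体是否符合前端期望的分页格式(`isValidTableData` 为假设的示意函数名,仅用于说明校验要点):

```javascript
// 校验返回体是否为 {code: 200, rows: [...], total: number} 的分页结构
function isValidTableData(resp) {
  return !!resp
    && resp.code === 200
    && Array.isArray(resp.rows)
    && typeof resp.total === 'number';
}
```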

## 测试结果记录表

| 测试项 | 测试结果 | 问题描述 | 解决方案 | 验证日期 |
|--------|---------|---------|---------|---------|
| 测试1: 个人中介导入失败记录查看 | ⬜ 通过 / ⬜ 失败 | | | |
| 测试2: 实体中介导入失败记录查看 | ⬜ 通过 / ⬜ 失败 | | | |
| 测试3: localStorage持久化 | ⬜ 通过 / ⬜ 失败 | | | |
| 测试4: 分页功能 | ⬜ 通过 / ⬜ 失败 | | | |
| 测试5: 清除历史记录 | ⬜ 通过 / ⬜ 失败 | | | |
| 测试6: 记录过期处理 | ⬜ 通过 / ⬜ 失败 | | | |
| 测试7: 两种类型导入互不影响 | ⬜ 通过 / ⬜ 失败 | | | |
| 测试8: 导入成功场景 | ⬜ 通过 / ⬜ 失败 | | | |
| 测试9: 网络错误处理 | ⬜ 通过 / ⬜ 失败 | | | |
| 测试10: 服务器错误处理 | ⬜ 通过 / ⬜ 失败 | | | |
| 测试11: 大数据量性能测试 | ⬜ 通过 / ⬜ 失败 | | | |
| 测试12: 并发导入 | ⬜ 通过 / ⬜ 失败 | | | |
| 测试13: 浏览器兼容性 | ⬜ 通过 / ⬜ 失败 | | | |
| 测试14: 原有功能不受影响 | ⬜ 通过 / ⬜ 失败 | | | |
| 测试15: 内存泄漏检查 | ⬜ 通过 / ⬜ 失败 | | | |

## 已知问题

记录测试过程中发现的已知问题:

| 问题编号 | 问题描述 | 严重程度 | 状态 | 解决方案 |
|---------|---------|---------|------|---------|
| | | | | |

## 测试总结

### 通过率统计

- 总测试项: 15项
- 通过: X项
- 失败: Y项
- 通过率: X%

### 测试结论

- ⬜ 测试通过,可以发布
- ⬜ 存在问题,需要修复后再测试

### 测试签名

- 测试人员: ___________
- 测试日期: ___________
- 审核人员: ___________
- 审核日期: ___________
BIN
doc/test-data/intermediary/person_1770542031351.xlsx
Normal file
Binary file not shown.
226
doc/test-data/purchase_transaction/generate-test-data.js
Normal file
@@ -0,0 +1,226 @@
const Excel = require('exceljs');

// 配置
const OUTPUT_FILE = 'purchase_test_data_2000_v2.xlsx';
const RECORD_COUNT = 2000;

// 数据池
const PURCHASE_CATEGORIES = ['货物类', '工程类', '服务类', '软件系统', '办公设备', '家具用具', '专用设备', '通讯设备'];
const PURCHASE_METHODS = ['公开招标', '邀请招标', '询价采购', '单一来源', '竞争性谈判'];
const DEPARTMENTS = ['人事部', '行政部', '财务部', '技术部', '市场部', '采购部', '研发部'];
const EMPLOYEES = [
    { id: 'EMP0001', name: '张伟' },
    { id: 'EMP0002', name: '王芳' },
    { id: 'EMP0003', name: '李娜' },
    { id: 'EMP0004', name: '刘洋' },
    { id: 'EMP0005', name: '陈静' },
    { id: 'EMP0006', name: '杨强' },
    { id: 'EMP0007', name: '赵敏' },
    { id: 'EMP0008', name: '孙杰' },
    { id: 'EMP0009', name: '周涛' },
    { id: 'EMP0010', name: '吴刚' },
    { id: 'EMP0011', name: '郑丽' },
    { id: 'EMP0012', name: '钱勇' },
    { id: 'EMP0013', name: '何静' },
    { id: 'EMP0014', name: '朱涛' },
    { id: 'EMP0015', name: '马超' }
];

// 生成随机整数
function randomInt(min, max) {
    return Math.floor(Math.random() * (max - min + 1)) + min;
}

// 生成随机浮点数
function randomFloat(min, max, decimals = 2) {
    const num = Math.random() * (max - min) + min;
    return parseFloat(num.toFixed(decimals));
}

// 从数组中随机选择
function randomChoice(arr) {
    return arr[Math.floor(Math.random() * arr.length)];
}

// 生成随机日期
function randomDate(start, end) {
    return new Date(start.getTime() + Math.random() * (end.getTime() - start.getTime()));
}

// 生成采购事项ID
function generatePurchaseId(index) {
    const timestamp = Date.now();
    const num = String(index + 1).padStart(4, '0');
    return `PUR${timestamp}${num}`;
}

// 生成测试数据
function generateTestData(count) {
    const data = [];
    const startDate = new Date('2023-01-01');
    const endDate = new Date('2025-12-31');

    for (let i = 0; i < count; i++) {
        const purchaseQty = randomFloat(1, 5000, 2);
        const unitPrice = randomFloat(100, 50000, 2);
        const budgetAmount = parseFloat((purchaseQty * unitPrice).toFixed(2));
        const discount = randomFloat(0.85, 0.98, 2);
        const actualAmount = parseFloat((budgetAmount * discount).toFixed(2));

        const employee = randomChoice(EMPLOYEES);

        // 生成Date对象
        const applyDateObj = randomDate(startDate, endDate);

        // 生成后续日期(都比申请日期晚)
        const planApproveDate = new Date(applyDateObj);
        planApproveDate.setDate(planApproveDate.getDate() + randomInt(1, 7));

        const announceDate = new Date(planApproveDate);
        announceDate.setDate(announceDate.getDate() + randomInt(3, 15));

        const bidOpenDate = new Date(announceDate);
        bidOpenDate.setDate(bidOpenDate.getDate() + randomInt(5, 20));

        const contractSignDate = new Date(bidOpenDate);
        contractSignDate.setDate(contractSignDate.getDate() + randomInt(3, 10));

        const expectedDeliveryDate = new Date(contractSignDate);
        expectedDeliveryDate.setDate(expectedDeliveryDate.getDate() + randomInt(15, 60));

        const actualDeliveryDate = new Date(expectedDeliveryDate);
        actualDeliveryDate.setDate(actualDeliveryDate.getDate() + randomInt(-2, 5));

        const acceptanceDate = new Date(actualDeliveryDate);
        acceptanceDate.setDate(acceptanceDate.getDate() + randomInt(1, 7));

        const settlementDate = new Date(acceptanceDate);
        settlementDate.setDate(settlementDate.getDate() + randomInt(7, 30));

        data.push({
            purchaseId: generatePurchaseId(i),
            purchaseCategory: randomChoice(PURCHASE_CATEGORIES),
            projectName: `${randomChoice(PURCHASE_CATEGORIES)}采购项目-${String(i + 1).padStart(4, '0')}`,
            subjectName: `${randomChoice(PURCHASE_CATEGORIES).replace('类', '')}配件-${String(i + 1).padStart(4, '0')}`,
            subjectDesc: `${randomChoice(PURCHASE_CATEGORIES)}采购项目标的物详细描述-${String(i + 1).padStart(4, '0')}`,
            purchaseQty: purchaseQty,
            budgetAmount: budgetAmount,
            bidAmount: actualAmount,
            actualAmount: actualAmount,
            contractAmount: actualAmount,
            settlementAmount: actualAmount,
            purchaseMethod: randomChoice(PURCHASE_METHODS),
            supplierName: `供应商公司-${String(i + 1).padStart(4, '0')}有限公司`,
            contactPerson: `联系人-${String(i + 1).padStart(4, '0')}`,
            contactPhone: `13${randomInt(0, 9)}${String(randomInt(10000000, 99999999))}`,
            supplierUscc: `91${randomInt(10000000, 99999999)}MA${String(randomInt(1000, 9999))}`,
            supplierBankAccount: `6222${String(randomInt(100000000000000, 999999999999999))}`,
            applyDate: applyDateObj, // Date对象
            planApproveDate: planApproveDate,
            announceDate: announceDate,
            bidOpenDate: bidOpenDate,
            contractSignDate: contractSignDate,
            expectedDeliveryDate: expectedDeliveryDate,
            actualDeliveryDate: actualDeliveryDate,
            acceptanceDate: acceptanceDate,
            settlementDate: settlementDate,
            applicantId: employee.id,
            applicantName: employee.name,
            applyDepartment: randomChoice(DEPARTMENTS),
            purchaseLeaderId: randomChoice(EMPLOYEES).id,
            purchaseLeaderName: randomChoice(EMPLOYEES).name,
            purchaseDepartment: '采购部'
        });
    }

    return data;
}

// 创建Excel文件
async function createExcelFile() {
    console.log('开始生成测试数据...');
    console.log(`记录数: ${RECORD_COUNT}`);

    // 生成测试数据
    const testData = generateTestData(RECORD_COUNT);
    console.log('测试数据生成完成');

    // 创建工作簿
    const workbook = new Excel.Workbook();
    const worksheet = workbook.addWorksheet('采购交易数据');

    // 定义列(按照Excel实体类的index顺序)
    worksheet.columns = [
        { header: '采购事项ID', key: 'purchaseId', width: 25 },
        { header: '采购类别', key: 'purchaseCategory', width: 15 },
        { header: '项目名称', key: 'projectName', width: 30 },
        { header: '标的物名称', key: 'subjectName', width: 30 },
        { header: '标的物描述', key: 'subjectDesc', width: 35 },
        { header: '采购数量', key: 'purchaseQty', width: 15 },
        { header: '预算金额', key: 'budgetAmount', width: 18 },
        { header: '中标金额', key: 'bidAmount', width: 18 },
        { header: '实际采购金额', key: 'actualAmount', width: 18 },
        { header: '合同金额', key: 'contractAmount', width: 18 },
        { header: '结算金额', key: 'settlementAmount', width: 18 },
        { header: '采购方式', key: 'purchaseMethod', width: 15 },
        { header: '中标供应商名称', key: 'supplierName', width: 30 },
        { header: '供应商联系人', key: 'contactPerson', width: 15 },
        { header: '供应商联系电话', key: 'contactPhone', width: 18 },
        { header: '供应商统一信用代码', key: 'supplierUscc', width: 25 },
        { header: '供应商银行账户', key: 'supplierBankAccount', width: 25 },
        { header: '采购申请日期', key: 'applyDate', width: 18 },
        { header: '采购计划批准日期', key: 'planApproveDate', width: 18 },
        { header: '采购公告发布日期', key: 'announceDate', width: 18 },
        { header: '开标日期', key: 'bidOpenDate', width: 18 },
        { header: '合同签订日期', key: 'contractSignDate', width: 18 },
        { header: '预计交货日期', key: 'expectedDeliveryDate', width: 18 },
        { header: '实际交货日期', key: 'actualDeliveryDate', width: 18 },
        { header: '验收日期', key: 'acceptanceDate', width: 18 },
        { header: '结算日期', key: 'settlementDate', width: 18 },
        { header: '申请人工号', key: 'applicantId', width: 15 },
        { header: '申请人姓名', key: 'applicantName', width: 15 },
        { header: '申请部门', key: 'applyDepartment', width: 18 },
        { header: '采购负责人工号', key: 'purchaseLeaderId', width: 15 },
        { header: '采购负责人姓名', key: 'purchaseLeaderName', width: 15 },
        { header: '采购部门', key: 'purchaseDepartment', width: 18 }
    ];

    // 添加数据
    worksheet.addRows(testData);

    // 设置表头样式
    const headerRow = worksheet.getRow(1);
    headerRow.font = { bold: true };
    headerRow.fill = {
        type: 'pattern',
        pattern: 'solid',
        fgColor: { argb: 'FFE6E6FA' }
    };

    // 保存文件
    console.log('正在写入Excel文件...');
    await workbook.xlsx.writeFile(OUTPUT_FILE);
    console.log(`✓ 文件已保存: ${OUTPUT_FILE}`);

    // 显示统计信息
    console.log('\n========================================');
    console.log('数据统计');
    console.log('========================================');
    console.log(`总记录数: ${testData.length}`);
    console.log(`采购数量范围: ${Math.min(...testData.map(d => d.purchaseQty))} - ${Math.max(...testData.map(d => d.purchaseQty))}`);
    console.log(`预算金额范围: ${Math.min(...testData.map(d => d.budgetAmount))} - ${Math.max(...testData.map(d => d.budgetAmount))}`);
    console.log('\n前3条记录预览:');
    testData.slice(0, 3).forEach((record, index) => {
        console.log(`\n记录 ${index + 1}:`);
        console.log(`  采购事项ID: ${record.purchaseId}`);
        console.log(`  项目名称: ${record.projectName}`);
        console.log(`  采购数量: ${record.purchaseQty}`);
        console.log(`  预算金额: ${record.budgetAmount}`);
        console.log(`  申请人: ${record.applicantName} (${record.applicantId})`);
        console.log(`  申请部门: ${record.applyDepartment}`);
        console.log(`  申请日期: ${record.applyDate}`);
    });
}

// 运行
createExcelFile().catch(console.error);
16
doc/test-data/purchase_transaction/node_modules/.bin/crc32
generated
vendored
Normal file
@@ -0,0 +1,16 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")

case `uname` in
    *CYGWIN*|*MINGW*|*MSYS*)
        if command -v cygpath > /dev/null 2>&1; then
            basedir=`cygpath -w "$basedir"`
        fi
    ;;
esac

if [ -x "$basedir/node" ]; then
  exec "$basedir/node" "$basedir/../crc-32/bin/crc32.njs" "$@"
else
  exec node "$basedir/../crc-32/bin/crc32.njs" "$@"
fi
17
doc/test-data/purchase_transaction/node_modules/.bin/crc32.cmd
generated
vendored
Normal file
@@ -0,0 +1,17 @@
@ECHO off
GOTO start
:find_dp0
SET dp0=%~dp0
EXIT /b
:start
SETLOCAL
CALL :find_dp0

IF EXIST "%dp0%\node.exe" (
  SET "_prog=%dp0%\node.exe"
) ELSE (
  SET "_prog=node"
  SET PATHEXT=%PATHEXT:;.JS;=;%
)

endLocal & goto #_undefined_# 2>NUL || title %COMSPEC% & "%_prog%" "%dp0%\..\crc-32\bin\crc32.njs" %*
28
doc/test-data/purchase_transaction/node_modules/.bin/crc32.ps1
generated
vendored
Normal file
@@ -0,0 +1,28 @@
#!/usr/bin/env pwsh
$basedir=Split-Path $MyInvocation.MyCommand.Definition -Parent

$exe=""
if ($PSVersionTable.PSVersion -lt "6.0" -or $IsWindows) {
  # Fix case when both the Windows and Linux builds of Node
  # are installed in the same directory
  $exe=".exe"
}
$ret=0
if (Test-Path "$basedir/node$exe") {
  # Support pipeline input
  if ($MyInvocation.ExpectingInput) {
    $input | & "$basedir/node$exe" "$basedir/../crc-32/bin/crc32.njs" $args
  } else {
    & "$basedir/node$exe" "$basedir/../crc-32/bin/crc32.njs" $args
  }
  $ret=$LASTEXITCODE
} else {
  # Support pipeline input
  if ($MyInvocation.ExpectingInput) {
    $input | & "node$exe" "$basedir/../crc-32/bin/crc32.njs" $args
  } else {
    & "node$exe" "$basedir/../crc-32/bin/crc32.njs" $args
  }
  $ret=$LASTEXITCODE
}
exit $ret
16
doc/test-data/purchase_transaction/node_modules/.bin/mkdirp
generated
vendored
Normal file
@@ -0,0 +1,16 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")

case `uname` in
    *CYGWIN*|*MINGW*|*MSYS*)
        if command -v cygpath > /dev/null 2>&1; then
            basedir=`cygpath -w "$basedir"`
        fi
    ;;
esac

if [ -x "$basedir/node" ]; then
  exec "$basedir/node" "$basedir/../mkdirp/bin/cmd.js" "$@"
else
  exec node "$basedir/../mkdirp/bin/cmd.js" "$@"
fi
17
doc/test-data/purchase_transaction/node_modules/.bin/mkdirp.cmd
generated
vendored
Normal file
@@ -0,0 +1,17 @@
@ECHO off
GOTO start
:find_dp0
SET dp0=%~dp0
EXIT /b
:start
SETLOCAL
CALL :find_dp0

IF EXIST "%dp0%\node.exe" (
  SET "_prog=%dp0%\node.exe"
) ELSE (
  SET "_prog=node"
  SET PATHEXT=%PATHEXT:;.JS;=;%
)

endLocal & goto #_undefined_# 2>NUL || title %COMSPEC% & "%_prog%" "%dp0%\..\mkdirp\bin\cmd.js" %*
28
doc/test-data/purchase_transaction/node_modules/.bin/mkdirp.ps1
generated
vendored
Normal file
@@ -0,0 +1,28 @@
#!/usr/bin/env pwsh
$basedir=Split-Path $MyInvocation.MyCommand.Definition -Parent

$exe=""
if ($PSVersionTable.PSVersion -lt "6.0" -or $IsWindows) {
  # Fix case when both the Windows and Linux builds of Node
  # are installed in the same directory
  $exe=".exe"
}
$ret=0
if (Test-Path "$basedir/node$exe") {
  # Support pipeline input
  if ($MyInvocation.ExpectingInput) {
    $input | & "$basedir/node$exe" "$basedir/../mkdirp/bin/cmd.js" $args
  } else {
    & "$basedir/node$exe" "$basedir/../mkdirp/bin/cmd.js" $args
  }
  $ret=$LASTEXITCODE
} else {
  # Support pipeline input
  if ($MyInvocation.ExpectingInput) {
    $input | & "node$exe" "$basedir/../mkdirp/bin/cmd.js" $args
  } else {
    & "node$exe" "$basedir/../mkdirp/bin/cmd.js" $args
  }
  $ret=$LASTEXITCODE
}
exit $ret
16
doc/test-data/purchase_transaction/node_modules/.bin/rimraf
generated
vendored
Normal file
@@ -0,0 +1,16 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")

case `uname` in
    *CYGWIN*|*MINGW*|*MSYS*)
        if command -v cygpath > /dev/null 2>&1; then
            basedir=`cygpath -w "$basedir"`
        fi
    ;;
esac

if [ -x "$basedir/node" ]; then
  exec "$basedir/node" "$basedir/../rimraf/bin.js" "$@"
else
  exec node "$basedir/../rimraf/bin.js" "$@"
fi
17
doc/test-data/purchase_transaction/node_modules/.bin/rimraf.cmd
generated
vendored
Normal file
@@ -0,0 +1,17 @@
@ECHO off
GOTO start
:find_dp0
SET dp0=%~dp0
EXIT /b
:start
SETLOCAL
CALL :find_dp0

IF EXIST "%dp0%\node.exe" (
  SET "_prog=%dp0%\node.exe"
) ELSE (
  SET "_prog=node"
  SET PATHEXT=%PATHEXT:;.JS;=;%
)

endLocal & goto #_undefined_# 2>NUL || title %COMSPEC% & "%_prog%" "%dp0%\..\rimraf\bin.js" %*
28
doc/test-data/purchase_transaction/node_modules/.bin/rimraf.ps1
generated
vendored
Normal file
@@ -0,0 +1,28 @@
#!/usr/bin/env pwsh
$basedir=Split-Path $MyInvocation.MyCommand.Definition -Parent

$exe=""
if ($PSVersionTable.PSVersion -lt "6.0" -or $IsWindows) {
  # Fix case when both the Windows and Linux builds of Node
  # are installed in the same directory
  $exe=".exe"
}
$ret=0
if (Test-Path "$basedir/node$exe") {
  # Support pipeline input
  if ($MyInvocation.ExpectingInput) {
    $input | & "$basedir/node$exe" "$basedir/../rimraf/bin.js" $args
  } else {
    & "$basedir/node$exe" "$basedir/../rimraf/bin.js" $args
  }
  $ret=$LASTEXITCODE
} else {
  # Support pipeline input
  if ($MyInvocation.ExpectingInput) {
    $input | & "node$exe" "$basedir/../rimraf/bin.js" $args
  } else {
    & "node$exe" "$basedir/../rimraf/bin.js" $args
  }
  $ret=$LASTEXITCODE
}
exit $ret
16
doc/test-data/purchase_transaction/node_modules/.bin/uuid
generated
vendored
Normal file
@@ -0,0 +1,16 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")

case `uname` in
    *CYGWIN*|*MINGW*|*MSYS*)
        if command -v cygpath > /dev/null 2>&1; then
            basedir=`cygpath -w "$basedir"`
        fi
    ;;
esac

if [ -x "$basedir/node" ]; then
  exec "$basedir/node" "$basedir/../uuid/dist/bin/uuid" "$@"
else
  exec node "$basedir/../uuid/dist/bin/uuid" "$@"
fi
17
doc/test-data/purchase_transaction/node_modules/.bin/uuid.cmd
generated
vendored
Normal file
@@ -0,0 +1,17 @@
@ECHO off
GOTO start
:find_dp0
SET dp0=%~dp0
EXIT /b
:start
SETLOCAL
CALL :find_dp0

IF EXIST "%dp0%\node.exe" (
  SET "_prog=%dp0%\node.exe"
) ELSE (
  SET "_prog=node"
  SET PATHEXT=%PATHEXT:;.JS;=;%
)

endLocal & goto #_undefined_# 2>NUL || title %COMSPEC% & "%_prog%" "%dp0%\..\uuid\dist\bin\uuid" %*
28
doc/test-data/purchase_transaction/node_modules/.bin/uuid.ps1
generated
vendored
Normal file
@@ -0,0 +1,28 @@
#!/usr/bin/env pwsh
$basedir=Split-Path $MyInvocation.MyCommand.Definition -Parent

$exe=""
if ($PSVersionTable.PSVersion -lt "6.0" -or $IsWindows) {
  # Fix case when both the Windows and Linux builds of Node
  # are installed in the same directory
  $exe=".exe"
}
$ret=0
if (Test-Path "$basedir/node$exe") {
  # Support pipeline input
  if ($MyInvocation.ExpectingInput) {
    $input | & "$basedir/node$exe" "$basedir/../uuid/dist/bin/uuid" $args
  } else {
    & "$basedir/node$exe" "$basedir/../uuid/dist/bin/uuid" $args
  }
  $ret=$LASTEXITCODE
} else {
  # Support pipeline input
  if ($MyInvocation.ExpectingInput) {
    $input | & "node$exe" "$basedir/../uuid/dist/bin/uuid" $args
  } else {
    & "node$exe" "$basedir/../uuid/dist/bin/uuid" $args
  }
  $ret=$LASTEXITCODE
}
exit $ret
1275
doc/test-data/purchase_transaction/node_modules/.package-lock.json
generated
vendored
Normal file
File diff suppressed because it is too large
Load Diff
73
doc/test-data/purchase_transaction/node_modules/@fast-csv/format/CHANGELOG.md
generated
vendored
Normal file
@@ -0,0 +1,73 @@
# Change Log

All notable changes to this project will be documented in this file.
See [Conventional Commits](https://conventionalcommits.org) for commit guidelines.

## [4.3.5](https://github.com/C2FO/fast-csv/compare/v4.3.4...v4.3.5) (2020-11-03)

### Bug Fixes

* **formatting,#446:** Do not quote fields that do not contain a quote ([13e688c](https://github.com/C2FO/fast-csv/commit/13e688cb38dcb67c7182211968c794146be54692)), closes [#446](https://github.com/C2FO/fast-csv/issues/446)

## [4.3.4](https://github.com/C2FO/fast-csv/compare/v4.3.3...v4.3.4) (2020-11-03)

### Bug Fixes

* **formatter,#503:** Do not ignore rows when headers is false ([1560564](https://github.com/C2FO/fast-csv/commit/1560564819c8b1254ca4ad43487830a4296570f6)), closes [#503](https://github.com/C2FO/fast-csv/issues/503)

## [4.3.3](https://github.com/C2FO/fast-csv/compare/v4.3.2...v4.3.3) (2020-10-30)

**Note:** Version bump only for package @fast-csv/format

## [4.3.1](https://github.com/C2FO/fast-csv/compare/v4.3.0...v4.3.1) (2020-06-23)

**Note:** Version bump only for package @fast-csv/format

# [4.3.0](https://github.com/C2FO/fast-csv/compare/v4.2.0...v4.3.0) (2020-05-27)

**Note:** Version bump only for package @fast-csv/format

# [4.2.0](https://github.com/C2FO/fast-csv/compare/v4.1.6...v4.2.0) (2020-05-19)

### Features

* **parsing:** Less restrictive row parsing type [#356](https://github.com/C2FO/fast-csv/issues/356) ([87d74ec](https://github.com/C2FO/fast-csv/commit/87d74ecd2cb16f3700b1942ebbbec221afe38790))

## [4.1.5](https://github.com/C2FO/fast-csv/compare/v4.1.4...v4.1.5) (2020-05-15)

**Note:** Version bump only for package @fast-csv/format

## [4.1.4](https://github.com/C2FO/fast-csv/compare/v4.1.3...v4.1.4) (2020-05-15)

**Note:** Version bump only for package @fast-csv/format
21
doc/test-data/purchase_transaction/node_modules/@fast-csv/format/LICENSE
generated
vendored
Normal file
@@ -0,0 +1,21 @@
The MIT License

Copyright (c) 2011-2019 C2FO

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.
20
doc/test-data/purchase_transaction/node_modules/@fast-csv/format/README.md
generated
vendored
Normal file
20
doc/test-data/purchase_transaction/node_modules/@fast-csv/format/README.md
generated
vendored
Normal file
@@ -0,0 +1,20 @@
<p align="center">
<a href="https://c2fo.io/fast-csv" target="blank"><img src="https://c2fo.io/fast-csv/img/logo.svg" width="200" alt="fast-csv Logo" /></a>
</p>

[](https://www.npmjs.org/package/@fast-csv/format)
[](https://travis-ci.org/C2FO/fast-csv)
[](https://coveralls.io/github/C2FO/fast-csv?branch=master)
[](https://snyk.io/test/github/C2FO/fast-csv?targetFile=packages/format/package.json)

# `@fast-csv/format`

`fast-csv` package to format CSVs.

## Installation

[Install Guide](https://c2fo.io/fast-csv/docs/introduction/install)

## Usage

To get started with `@fast-csv/format` [check out the docs](https://c2fo.io/fast-csv/docs/formatting/getting-started)
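The usage docs linked in the README are external; as a rough, package-free illustration of the output shape the formatter produces with default options (fields joined by `,`, rows separated by `\n`, headers taken from the first row's keys), consider this sketch — `toCsvSketch` is a hypothetical helper, not part of `@fast-csv/format`:

```javascript
// Hypothetical sketch (not part of the package) of the default output shape:
// a header row from the first object's keys, then one comma-joined line per row.
function toCsvSketch(rows) {
    const headers = Object.keys(rows[0]);
    const lines = [headers.join(',')];
    for (const row of rows) {
        lines.push(headers.map((h) => row[h]).join(','));
    }
    return lines.join('\n');
}

console.log(toCsvSketch([
    { header1: 'value1a', header2: 'value1b' },
    { header1: 'value2a', header2: 'value2b' },
]));
```

Note this sketch ignores quoting and escaping entirely; the real formatter handles those per its quoting options.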
13
doc/test-data/purchase_transaction/node_modules/@fast-csv/format/build/src/CsvFormatterStream.d.ts
generated
vendored
Normal file
@@ -0,0 +1,13 @@
/// <reference types="node" />
import { Transform, TransformCallback } from 'stream';
import { FormatterOptions } from './FormatterOptions';
import { Row, RowTransformFunction } from './types';
export declare class CsvFormatterStream<I extends Row, O extends Row> extends Transform {
    private formatterOptions;
    private rowFormatter;
    private hasWrittenBOM;
    constructor(formatterOptions: FormatterOptions<I, O>);
    transform(transformFunction: RowTransformFunction<I, O>): CsvFormatterStream<I, O>;
    _transform(row: I, encoding: string, cb: TransformCallback): void;
    _flush(cb: TransformCallback): void;
}
63
doc/test-data/purchase_transaction/node_modules/@fast-csv/format/build/src/CsvFormatterStream.js
generated
vendored
Normal file
@@ -0,0 +1,63 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.CsvFormatterStream = void 0;
const stream_1 = require("stream");
const formatter_1 = require("./formatter");
class CsvFormatterStream extends stream_1.Transform {
    constructor(formatterOptions) {
        super({ writableObjectMode: formatterOptions.objectMode });
        this.hasWrittenBOM = false;
        this.formatterOptions = formatterOptions;
        this.rowFormatter = new formatter_1.RowFormatter(formatterOptions);
        // if writeBOM is false then set to true
        // if writeBOM is true then set to false by default so it is written out
        this.hasWrittenBOM = !formatterOptions.writeBOM;
    }
    transform(transformFunction) {
        this.rowFormatter.rowTransform = transformFunction;
        return this;
    }
    _transform(row, encoding, cb) {
        let cbCalled = false;
        try {
            if (!this.hasWrittenBOM) {
                this.push(this.formatterOptions.BOM);
                this.hasWrittenBOM = true;
            }
            this.rowFormatter.format(row, (err, rows) => {
                if (err) {
                    cbCalled = true;
                    return cb(err);
                }
                if (rows) {
                    rows.forEach((r) => {
                        this.push(Buffer.from(r, 'utf8'));
                    });
                }
                cbCalled = true;
                return cb();
            });
        }
        catch (e) {
            if (cbCalled) {
                throw e;
            }
            cb(e);
        }
    }
    _flush(cb) {
        this.rowFormatter.finish((err, rows) => {
            if (err) {
                return cb(err);
            }
            if (rows) {
                rows.forEach((r) => {
                    this.push(Buffer.from(r, 'utf8'));
                });
            }
            return cb();
        });
    }
}
exports.CsvFormatterStream = CsvFormatterStream;
//# sourceMappingURL=CsvFormatterStream.js.map
1
doc/test-data/purchase_transaction/node_modules/@fast-csv/format/build/src/CsvFormatterStream.js.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"file":"CsvFormatterStream.js","sourceRoot":"","sources":["../../src/CsvFormatterStream.ts"],"names":[],"mappings":";;;AAAA,mCAAsD;AAGtD,2CAA2C;AAE3C,MAAa,kBAAiD,SAAQ,kBAAS;IAO3E,YAAmB,gBAAwC;QACvD,KAAK,CAAC,EAAE,kBAAkB,EAAE,gBAAgB,CAAC,UAAU,EAAE,CAAC,CAAC;QAHvD,kBAAa,GAAG,KAAK,CAAC;QAI1B,IAAI,CAAC,gBAAgB,GAAG,gBAAgB,CAAC;QACzC,IAAI,CAAC,YAAY,GAAG,IAAI,wBAAY,CAAC,gBAAgB,CAAC,CAAC;QACvD,wCAAwC;QACxC,wEAAwE;QACxE,IAAI,CAAC,aAAa,GAAG,CAAC,gBAAgB,CAAC,QAAQ,CAAC;IACpD,CAAC;IAEM,SAAS,CAAC,iBAA6C;QAC1D,IAAI,CAAC,YAAY,CAAC,YAAY,GAAG,iBAAiB,CAAC;QACnD,OAAO,IAAI,CAAC;IAChB,CAAC;IAEM,UAAU,CAAC,GAAM,EAAE,QAAgB,EAAE,EAAqB;QAC7D,IAAI,QAAQ,GAAG,KAAK,CAAC;QACrB,IAAI;YACA,IAAI,CAAC,IAAI,CAAC,aAAa,EAAE;gBACrB,IAAI,CAAC,IAAI,CAAC,IAAI,CAAC,gBAAgB,CAAC,GAAG,CAAC,CAAC;gBACrC,IAAI,CAAC,aAAa,GAAG,IAAI,CAAC;aAC7B;YACD,IAAI,CAAC,YAAY,CAAC,MAAM,CAAC,GAAG,EAAE,CAAC,GAAG,EAAE,IAAI,EAAQ,EAAE;gBAC9C,IAAI,GAAG,EAAE;oBACL,QAAQ,GAAG,IAAI,CAAC;oBAChB,OAAO,EAAE,CAAC,GAAG,CAAC,CAAC;iBAClB;gBACD,IAAI,IAAI,EAAE;oBACN,IAAI,CAAC,OAAO,CAAC,CAAC,CAAC,EAAQ,EAAE;wBACrB,IAAI,CAAC,IAAI,CAAC,MAAM,CAAC,IAAI,CAAC,CAAC,EAAE,MAAM,CAAC,CAAC,CAAC;oBACtC,CAAC,CAAC,CAAC;iBACN;gBACD,QAAQ,GAAG,IAAI,CAAC;gBAChB,OAAO,EAAE,EAAE,CAAC;YAChB,CAAC,CAAC,CAAC;SACN;QAAC,OAAO,CAAC,EAAE;YACR,IAAI,QAAQ,EAAE;gBACV,MAAM,CAAC,CAAC;aACX;YACD,EAAE,CAAC,CAAC,CAAC,CAAC;SACT;IACL,CAAC;IAEM,MAAM,CAAC,EAAqB;QAC/B,IAAI,CAAC,YAAY,CAAC,MAAM,CAAC,CAAC,GAAG,EAAE,IAAI,EAAQ,EAAE;YACzC,IAAI,GAAG,EAAE;gBACL,OAAO,EAAE,CAAC,GAAG,CAAC,CAAC;aAClB;YACD,IAAI,IAAI,EAAE;gBACN,IAAI,CAAC,OAAO,CAAC,CAAC,CAAC,EAAQ,EAAE;oBACrB,IAAI,CAAC,IAAI,CAAC,MAAM,CAAC,IAAI,CAAC,CAAC,EAAE,MAAM,CAAC,CAAC,CAAC;gBACtC,CAAC,CAAC,CAAC;aACN;YACD,OAAO,EAAE,EAAE,CAAC;QAChB,CAAC,CAAC,CAAC;IACP,CAAC;CACJ;AA9DD,gDA8DC"}
39
doc/test-data/purchase_transaction/node_modules/@fast-csv/format/build/src/FormatterOptions.d.ts
generated
vendored
Normal file
@@ -0,0 +1,39 @@
import { Row, RowTransformFunction } from './types';
interface QuoteColumnMap {
    [s: string]: boolean;
}
declare type QuoteColumns = boolean | boolean[] | QuoteColumnMap;
export interface FormatterOptionsArgs<I extends Row, O extends Row> {
    objectMode?: boolean;
    delimiter?: string;
    rowDelimiter?: string;
    quote?: string | boolean;
    escape?: string;
    quoteColumns?: QuoteColumns;
    quoteHeaders?: QuoteColumns;
    headers?: null | boolean | string[];
    writeHeaders?: boolean;
    includeEndRowDelimiter?: boolean;
    writeBOM?: boolean;
    transform?: RowTransformFunction<I, O>;
    alwaysWriteHeaders?: boolean;
}
export declare class FormatterOptions<I extends Row, O extends Row> {
    readonly objectMode: boolean;
    readonly delimiter: string;
    readonly rowDelimiter: string;
    readonly quote: string;
    readonly escape: string;
    readonly quoteColumns: QuoteColumns;
    readonly quoteHeaders: QuoteColumns;
    readonly headers: null | string[];
    readonly includeEndRowDelimiter: boolean;
    readonly transform?: RowTransformFunction<I, O>;
    readonly shouldWriteHeaders: boolean;
    readonly writeBOM: boolean;
    readonly escapedQuote: string;
    readonly BOM: string;
    readonly alwaysWriteHeaders: boolean;
    constructor(opts?: FormatterOptionsArgs<I, O>);
}
export {};
38
doc/test-data/purchase_transaction/node_modules/@fast-csv/format/build/src/FormatterOptions.js
generated
vendored
Normal file
@@ -0,0 +1,38 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.FormatterOptions = void 0;
class FormatterOptions {
    constructor(opts = {}) {
        var _a;
        this.objectMode = true;
        this.delimiter = ',';
        this.rowDelimiter = '\n';
        this.quote = '"';
        this.escape = this.quote;
        this.quoteColumns = false;
        this.quoteHeaders = this.quoteColumns;
        this.headers = null;
        this.includeEndRowDelimiter = false;
        this.writeBOM = false;
        this.BOM = '\ufeff';
        this.alwaysWriteHeaders = false;
        Object.assign(this, opts || {});
        if (typeof (opts === null || opts === void 0 ? void 0 : opts.quoteHeaders) === 'undefined') {
            this.quoteHeaders = this.quoteColumns;
        }
        if ((opts === null || opts === void 0 ? void 0 : opts.quote) === true) {
            this.quote = '"';
        }
        else if ((opts === null || opts === void 0 ? void 0 : opts.quote) === false) {
            this.quote = '';
        }
        if (typeof (opts === null || opts === void 0 ? void 0 : opts.escape) !== 'string') {
            this.escape = this.quote;
        }
        this.shouldWriteHeaders = !!this.headers && ((_a = opts.writeHeaders) !== null && _a !== void 0 ? _a : true);
        this.headers = Array.isArray(this.headers) ? this.headers : null;
        this.escapedQuote = `${this.escape}${this.quote}`;
    }
}
exports.FormatterOptions = FormatterOptions;
//# sourceMappingURL=FormatterOptions.js.map
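The `FormatterOptions` constructor normalizes the `quote` option (`true` stays `'"'`, `false` becomes the empty string, a string overrides it) and derives `escapedQuote` as `escape` followed by `quote`. A standalone sketch of just that resolution, with `resolveQuote` as a hypothetical name:

```javascript
// Standalone sketch of the quote/escape normalization done by the
// FormatterOptions constructor above (not an import of the package).
function resolveQuote(opts = {}) {
    let quote = '"';
    if (opts.quote === false) quote = '';
    else if (typeof opts.quote === 'string') quote = opts.quote;
    // escape defaults to the quote character unless explicitly a string
    const escape = typeof opts.escape === 'string' ? opts.escape : quote;
    return { quote, escape, escapedQuote: `${escape}${quote}` };
}

console.log(resolveQuote());                 // default double-quote settings
console.log(resolveQuote({ quote: false })); // quoting disabled entirely
```

With the defaults this yields `escapedQuote === '""'`, which is what doubles embedded quotes during field formatting.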
1
doc/test-data/purchase_transaction/node_modules/@fast-csv/format/build/src/FormatterOptions.js.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"file":"FormatterOptions.js","sourceRoot":"","sources":["../../src/FormatterOptions.ts"],"names":[],"mappings":";;;AAwBA,MAAa,gBAAgB;IA+BzB,YAAmB,OAAmC,EAAE;;QA9BxC,eAAU,GAAY,IAAI,CAAC;QAE3B,cAAS,GAAW,GAAG,CAAC;QAExB,iBAAY,GAAW,IAAI,CAAC;QAE5B,UAAK,GAAW,GAAG,CAAC;QAEpB,WAAM,GAAW,IAAI,CAAC,KAAK,CAAC;QAE5B,iBAAY,GAAiB,KAAK,CAAC;QAEnC,iBAAY,GAAiB,IAAI,CAAC,YAAY,CAAC;QAE/C,YAAO,GAAoB,IAAI,CAAC;QAEhC,2BAAsB,GAAY,KAAK,CAAC;QAMxC,aAAQ,GAAY,KAAK,CAAC;QAI1B,QAAG,GAAW,QAAQ,CAAC;QAEvB,uBAAkB,GAAY,KAAK,CAAC;QAGhD,MAAM,CAAC,MAAM,CAAC,IAAI,EAAE,IAAI,IAAI,EAAE,CAAC,CAAC;QAEhC,IAAI,QAAO,IAAI,aAAJ,IAAI,uBAAJ,IAAI,CAAE,YAAY,CAAA,KAAK,WAAW,EAAE;YAC3C,IAAI,CAAC,YAAY,GAAG,IAAI,CAAC,YAAY,CAAC;SACzC;QACD,IAAI,CAAA,IAAI,aAAJ,IAAI,uBAAJ,IAAI,CAAE,KAAK,MAAK,IAAI,EAAE;YACtB,IAAI,CAAC,KAAK,GAAG,GAAG,CAAC;SACpB;aAAM,IAAI,CAAA,IAAI,aAAJ,IAAI,uBAAJ,IAAI,CAAE,KAAK,MAAK,KAAK,EAAE;YAC9B,IAAI,CAAC,KAAK,GAAG,EAAE,CAAC;SACnB;QACD,IAAI,QAAO,IAAI,aAAJ,IAAI,uBAAJ,IAAI,CAAE,MAAM,CAAA,KAAK,QAAQ,EAAE;YAClC,IAAI,CAAC,MAAM,GAAG,IAAI,CAAC,KAAK,CAAC;SAC5B;QACD,IAAI,CAAC,kBAAkB,GAAG,CAAC,CAAC,IAAI,CAAC,OAAO,IAAI,OAAC,IAAI,CAAC,YAAY,mCAAI,IAAI,CAAC,CAAC;QACxE,IAAI,CAAC,OAAO,GAAG,KAAK,CAAC,OAAO,CAAC,IAAI,CAAC,OAAO,CAAC,CAAC,CAAC,CAAC,IAAI,CAAC,OAAO,CAAC,CAAC,CAAC,IAAI,CAAC;QACjE,IAAI,CAAC,YAAY,GAAG,GAAG,IAAI,CAAC,MAAM,GAAG,IAAI,CAAC,KAAK,EAAE,CAAC;IACtD,CAAC;CACJ;AAjDD,4CAiDC"}
13
doc/test-data/purchase_transaction/node_modules/@fast-csv/format/build/src/formatter/FieldFormatter.d.ts
generated
vendored
Normal file
@@ -0,0 +1,13 @@
import { FormatterOptions } from '../FormatterOptions';
import { Row } from '../types';
export declare class FieldFormatter<I extends Row, O extends Row> {
    private readonly formatterOptions;
    private _headers;
    private readonly REPLACE_REGEXP;
    private readonly ESCAPE_REGEXP;
    constructor(formatterOptions: FormatterOptions<I, O>);
    set headers(headers: string[]);
    private shouldQuote;
    format(field: string, fieldIndex: number, isHeader: boolean): string;
    private quoteField;
}
58
doc/test-data/purchase_transaction/node_modules/@fast-csv/format/build/src/formatter/FieldFormatter.js
generated
vendored
Normal file
@@ -0,0 +1,58 @@
"use strict";
var __importDefault = (this && this.__importDefault) || function (mod) {
    return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.FieldFormatter = void 0;
const lodash_isboolean_1 = __importDefault(require("lodash.isboolean"));
const lodash_isnil_1 = __importDefault(require("lodash.isnil"));
const lodash_escaperegexp_1 = __importDefault(require("lodash.escaperegexp"));
class FieldFormatter {
    constructor(formatterOptions) {
        this._headers = null;
        this.formatterOptions = formatterOptions;
        if (formatterOptions.headers !== null) {
            this.headers = formatterOptions.headers;
        }
        this.REPLACE_REGEXP = new RegExp(formatterOptions.quote, 'g');
        const escapePattern = `[${formatterOptions.delimiter}${lodash_escaperegexp_1.default(formatterOptions.rowDelimiter)}|\r|\n]`;
        this.ESCAPE_REGEXP = new RegExp(escapePattern);
    }
    set headers(headers) {
        this._headers = headers;
    }
    shouldQuote(fieldIndex, isHeader) {
        const quoteConfig = isHeader ? this.formatterOptions.quoteHeaders : this.formatterOptions.quoteColumns;
        if (lodash_isboolean_1.default(quoteConfig)) {
            return quoteConfig;
        }
        if (Array.isArray(quoteConfig)) {
            return quoteConfig[fieldIndex];
        }
        if (this._headers !== null) {
            return quoteConfig[this._headers[fieldIndex]];
        }
        return false;
    }
    format(field, fieldIndex, isHeader) {
        const preparedField = `${lodash_isnil_1.default(field) ? '' : field}`.replace(/\0/g, '');
        const { formatterOptions } = this;
        if (formatterOptions.quote !== '') {
            const shouldEscape = preparedField.indexOf(formatterOptions.quote) !== -1;
            if (shouldEscape) {
                return this.quoteField(preparedField.replace(this.REPLACE_REGEXP, formatterOptions.escapedQuote));
            }
        }
        const hasEscapeCharacters = preparedField.search(this.ESCAPE_REGEXP) !== -1;
        if (hasEscapeCharacters || this.shouldQuote(fieldIndex, isHeader)) {
            return this.quoteField(preparedField);
        }
        return preparedField;
    }
    quoteField(field) {
        const { quote } = this.formatterOptions;
        return `${quote}${field}${quote}`;
    }
}
exports.FieldFormatter = FieldFormatter;
//# sourceMappingURL=FieldFormatter.js.map
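`FieldFormatter.format` above quotes a field when it contains the quote character (doubling embedded quotes via `escapedQuote`) or when it matches the delimiter/row-delimiter pattern. A self-contained sketch of those rules for the default options, with `formatField` as a hypothetical name:

```javascript
// Self-contained sketch of the quoting rules in FieldFormatter.format above,
// assuming default options: delimiter ',', quote '"', escapedQuote '""'.
function formatField(field) {
    // nil becomes an empty string; NUL bytes are stripped
    const prepared = `${field == null ? '' : field}`.replace(/\0/g, '');
    if (prepared.indexOf('"') !== -1) {
        // embedded quotes are doubled, then the whole field is quoted
        return `"${prepared.replace(/"/g, '""')}"`;
    }
    if (/[,\n\r]/.test(prepared)) {
        // fields containing the delimiter or a line break are quoted
        return `"${prepared}"`;
    }
    return prepared;
}

console.log(formatField('plain'));    // left unquoted
console.log(formatField('a,b'));      // quoted: contains the delimiter
console.log(formatField('say "hi"')); // quoted with doubled quotes
```

This matches the classic CSV quoting convention (RFC 4180 style) that the formatter implements.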
1
doc/test-data/purchase_transaction/node_modules/@fast-csv/format/build/src/formatter/FieldFormatter.js.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"file":"FieldFormatter.js","sourceRoot":"","sources":["../../../src/formatter/FieldFormatter.ts"],"names":[],"mappings":";;;;;;AAAA,wEAAyC;AACzC,gEAAiC;AACjC,8EAA+C;AAI/C,MAAa,cAAc;IASvB,YAAmB,gBAAwC;QANnD,aAAQ,GAAoB,IAAI,CAAC;QAOrC,IAAI,CAAC,gBAAgB,GAAG,gBAAgB,CAAC;QACzC,IAAI,gBAAgB,CAAC,OAAO,KAAK,IAAI,EAAE;YACnC,IAAI,CAAC,OAAO,GAAG,gBAAgB,CAAC,OAAO,CAAC;SAC3C;QACD,IAAI,CAAC,cAAc,GAAG,IAAI,MAAM,CAAC,gBAAgB,CAAC,KAAK,EAAE,GAAG,CAAC,CAAC;QAC9D,MAAM,aAAa,GAAG,IAAI,gBAAgB,CAAC,SAAS,GAAG,6BAAY,CAAC,gBAAgB,CAAC,YAAY,CAAC,SAAS,CAAC;QAC5G,IAAI,CAAC,aAAa,GAAG,IAAI,MAAM,CAAC,aAAa,CAAC,CAAC;IACnD,CAAC;IAED,IAAW,OAAO,CAAC,OAAiB;QAChC,IAAI,CAAC,QAAQ,GAAG,OAAO,CAAC;IAC5B,CAAC;IAEO,WAAW,CAAC,UAAkB,EAAE,QAAiB;QACrD,MAAM,WAAW,GAAG,QAAQ,CAAC,CAAC,CAAC,IAAI,CAAC,gBAAgB,CAAC,YAAY,CAAC,CAAC,CAAC,IAAI,CAAC,gBAAgB,CAAC,YAAY,CAAC;QACvG,IAAI,0BAAS,CAAC,WAAW,CAAC,EAAE;YACxB,OAAO,WAAW,CAAC;SACtB;QACD,IAAI,KAAK,CAAC,OAAO,CAAC,WAAW,CAAC,EAAE;YAC5B,OAAO,WAAW,CAAC,UAAU,CAAC,CAAC;SAClC;QACD,IAAI,IAAI,CAAC,QAAQ,KAAK,IAAI,EAAE;YACxB,OAAO,WAAW,CAAC,IAAI,CAAC,QAAQ,CAAC,UAAU,CAAC,CAAC,CAAC;SACjD;QACD,OAAO,KAAK,CAAC;IACjB,CAAC;IAEM,MAAM,CAAC,KAAa,EAAE,UAAkB,EAAE,QAAiB;QAC9D,MAAM,aAAa,GAAG,GAAG,sBAAK,CAAC,KAAK,CAAC,CAAC,CAAC,CAAC,EAAE,CAAC,CAAC,CAAC,KAAK,EAAE,CAAC,OAAO,CAAC,KAAK,EAAE,EAAE,CAAC,CAAC;QACxE,MAAM,EAAE,gBAAgB,EAAE,GAAG,IAAI,CAAC;QAClC,IAAI,gBAAgB,CAAC,KAAK,KAAK,EAAE,EAAE;YAC/B,MAAM,YAAY,GAAG,aAAa,CAAC,OAAO,CAAC,gBAAgB,CAAC,KAAK,CAAC,KAAK,CAAC,CAAC,CAAC;YAC1E,IAAI,YAAY,EAAE;gBACd,OAAO,IAAI,CAAC,UAAU,CAAC,aAAa,CAAC,OAAO,CAAC,IAAI,CAAC,cAAc,EAAE,gBAAgB,CAAC,YAAY,CAAC,CAAC,CAAC;aACrG;SACJ;QACD,MAAM,mBAAmB,GAAG,aAAa,CAAC,MAAM,CAAC,IAAI,CAAC,aAAa,CAAC,KAAK,CAAC,CAAC,CAAC;QAC5E,IAAI,mBAAmB,IAAI,IAAI,CAAC,WAAW,CAAC,UAAU,EAAE,QAAQ,CAAC,EAAE;YAC/D,OAAO,IAAI,CAAC,UAAU,CAAC,aAAa,CAAC,CAAC;SACzC;QACD,OAAO,aAAa,CAAC;IACzB,CAAC;IAEO,UAAU,CAAC,KAAa;QAC5B,MAAM,EAAE,KAAK,EAAE,GAAG,IAAI,CAAC,gBAAgB,CAAC;QACxC,OAAO,GAAG,KAAK,GAAG,KAAK,GAAG,KAAK,EAAE,CAAC;IACtC,CAAC;CACJ;AAzDD,wCAyDC"}
25
doc/test-data/purchase_transaction/node_modules/@fast-csv/format/build/src/formatter/RowFormatter.d.ts
generated
vendored
Normal file
@@ -0,0 +1,25 @@
import { FormatterOptions } from '../FormatterOptions';
import { Row, RowArray, RowTransformFunction } from '../types';
declare type RowFormatterCallback = (error: Error | null, data?: RowArray) => void;
export declare class RowFormatter<I extends Row, O extends Row> {
    private static isRowHashArray;
    private static isRowArray;
    private static gatherHeaders;
    private static createTransform;
    private readonly formatterOptions;
    private readonly fieldFormatter;
    private readonly shouldWriteHeaders;
    private _rowTransform?;
    private headers;
    private hasWrittenHeaders;
    private rowCount;
    constructor(formatterOptions: FormatterOptions<I, O>);
    set rowTransform(transformFunction: RowTransformFunction<I, O>);
    format(row: I, cb: RowFormatterCallback): void;
    finish(cb: RowFormatterCallback): void;
    private checkHeaders;
    private gatherColumns;
    private callTransformer;
    private formatColumns;
}
export {};
168
doc/test-data/purchase_transaction/node_modules/@fast-csv/format/build/src/formatter/RowFormatter.js
generated
vendored
Normal file
@@ -0,0 +1,168 @@
"use strict";
var __importDefault = (this && this.__importDefault) || function (mod) {
    return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.RowFormatter = void 0;
const lodash_isfunction_1 = __importDefault(require("lodash.isfunction"));
const lodash_isequal_1 = __importDefault(require("lodash.isequal"));
const FieldFormatter_1 = require("./FieldFormatter");
const types_1 = require("../types");
class RowFormatter {
    constructor(formatterOptions) {
        this.rowCount = 0;
        this.formatterOptions = formatterOptions;
        this.fieldFormatter = new FieldFormatter_1.FieldFormatter(formatterOptions);
        this.headers = formatterOptions.headers;
        this.shouldWriteHeaders = formatterOptions.shouldWriteHeaders;
        this.hasWrittenHeaders = false;
        if (this.headers !== null) {
            this.fieldFormatter.headers = this.headers;
        }
        if (formatterOptions.transform) {
            this.rowTransform = formatterOptions.transform;
        }
    }
    static isRowHashArray(row) {
        if (Array.isArray(row)) {
            return Array.isArray(row[0]) && row[0].length === 2;
        }
        return false;
    }
    static isRowArray(row) {
        return Array.isArray(row) && !this.isRowHashArray(row);
    }
    // get headers from a row item
    static gatherHeaders(row) {
        if (RowFormatter.isRowHashArray(row)) {
            // lets assume a multi-dimesional array with item 0 being the header
            return row.map((it) => it[0]);
        }
        if (Array.isArray(row)) {
            return row;
        }
        return Object.keys(row);
    }
    // eslint-disable-next-line @typescript-eslint/no-shadow
    static createTransform(transformFunction) {
        if (types_1.isSyncTransform(transformFunction)) {
            return (row, cb) => {
                let transformedRow = null;
                try {
                    transformedRow = transformFunction(row);
                }
                catch (e) {
                    return cb(e);
                }
                return cb(null, transformedRow);
            };
        }
        return (row, cb) => {
            transformFunction(row, cb);
        };
    }
    set rowTransform(transformFunction) {
        if (!lodash_isfunction_1.default(transformFunction)) {
            throw new TypeError('The transform should be a function');
        }
        this._rowTransform = RowFormatter.createTransform(transformFunction);
    }
    format(row, cb) {
        this.callTransformer(row, (err, transformedRow) => {
            if (err) {
                return cb(err);
            }
            if (!row) {
                return cb(null);
            }
            const rows = [];
            if (transformedRow) {
                const { shouldFormatColumns, headers } = this.checkHeaders(transformedRow);
                if (this.shouldWriteHeaders && headers && !this.hasWrittenHeaders) {
                    rows.push(this.formatColumns(headers, true));
                    this.hasWrittenHeaders = true;
                }
                if (shouldFormatColumns) {
                    const columns = this.gatherColumns(transformedRow);
                    rows.push(this.formatColumns(columns, false));
                }
            }
            return cb(null, rows);
        });
    }
    finish(cb) {
        const rows = [];
        // check if we should write headers and we didnt get any rows
        if (this.formatterOptions.alwaysWriteHeaders && this.rowCount === 0) {
            if (!this.headers) {
                return cb(new Error('`alwaysWriteHeaders` option is set to true but `headers` option not provided.'));
            }
            rows.push(this.formatColumns(this.headers, true));
        }
        if (this.formatterOptions.includeEndRowDelimiter) {
            rows.push(this.formatterOptions.rowDelimiter);
        }
        return cb(null, rows);
    }
    // check if we need to write header return true if we should also write a row
    // could be false if headers is true and the header row(first item) is passed in
    checkHeaders(row) {
        if (this.headers) {
            // either the headers were provided by the user or we have already gathered them.
            return { shouldFormatColumns: true, headers: this.headers };
        }
        const headers = RowFormatter.gatherHeaders(row);
        this.headers = headers;
        this.fieldFormatter.headers = headers;
        if (!this.shouldWriteHeaders) {
            // if we are not supposed to write the headers then
            // always format the columns
            return { shouldFormatColumns: true, headers: null };
        }
        // if the row is equal to headers dont format
        return { shouldFormatColumns: !lodash_isequal_1.default(headers, row), headers };
    }
    // todo change this method to unknown[]
    gatherColumns(row) {
        if (this.headers === null) {
            throw new Error('Headers is currently null');
        }
        if (!Array.isArray(row)) {
            return this.headers.map((header) => row[header]);
        }
        if (RowFormatter.isRowHashArray(row)) {
            return this.headers.map((header, i) => {
                const col = row[i];
                if (col) {
                    return col[1];
                }
                return '';
            });
        }
        // if its a one dimensional array and headers were not provided
        // then just return the row
        if (RowFormatter.isRowArray(row) && !this.shouldWriteHeaders) {
            return row;
        }
        return this.headers.map((header, i) => row[i]);
    }
    callTransformer(row, cb) {
        if (!this._rowTransform) {
            return cb(null, row);
        }
        return this._rowTransform(row, cb);
    }
    formatColumns(columns, isHeadersRow) {
        const formattedCols = columns
            .map((field, i) => this.fieldFormatter.format(field, i, isHeadersRow))
            .join(this.formatterOptions.delimiter);
        const { rowCount } = this;
        this.rowCount += 1;
        if (rowCount) {
            return [this.formatterOptions.rowDelimiter, formattedCols].join('');
        }
        return formattedCols;
    }
}
exports.RowFormatter = RowFormatter;
//# sourceMappingURL=RowFormatter.js.map
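A detail worth noting in `formatColumns` above: the row delimiter is *prefixed* to every row after the first rather than appended, so output never carries a trailing newline unless `includeEndRowDelimiter` is set. A standalone sketch of that placement, with `makeRowWriter` as a hypothetical name:

```javascript
// Standalone sketch of formatColumns' delimiter placement above: the row
// delimiter is prefixed to every row after the first, never appended.
function makeRowWriter(delimiter = ',', rowDelimiter = '\n') {
    let rowCount = 0;
    return (columns) => {
        const formatted = columns.join(delimiter);
        const prefixed = rowCount ? rowDelimiter + formatted : formatted;
        rowCount += 1;
        return prefixed;
    };
}

const writeRow = makeRowWriter();
const out = [['a', 'b'], ['1', '2']].map(writeRow).join('');
console.log(JSON.stringify(out)); // no trailing newline
```

Prefixing keeps the formatter from having to know whether a row is the last one, which matters in a streaming context.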
1
doc/test-data/purchase_transaction/node_modules/@fast-csv/format/build/src/formatter/RowFormatter.js.map
generated
vendored
Normal file
File diff suppressed because one or more lines are too long
2
doc/test-data/purchase_transaction/node_modules/@fast-csv/format/build/src/formatter/index.d.ts
generated
vendored
Normal file
@@ -0,0 +1,2 @@
export { RowFormatter } from './RowFormatter';
export { FieldFormatter } from './FieldFormatter';
8
doc/test-data/purchase_transaction/node_modules/@fast-csv/format/build/src/formatter/index.js
generated
vendored
Normal file
@@ -0,0 +1,8 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.FieldFormatter = exports.RowFormatter = void 0;
var RowFormatter_1 = require("./RowFormatter");
Object.defineProperty(exports, "RowFormatter", { enumerable: true, get: function () { return RowFormatter_1.RowFormatter; } });
var FieldFormatter_1 = require("./FieldFormatter");
Object.defineProperty(exports, "FieldFormatter", { enumerable: true, get: function () { return FieldFormatter_1.FieldFormatter; } });
//# sourceMappingURL=index.js.map
1
doc/test-data/purchase_transaction/node_modules/@fast-csv/format/build/src/formatter/index.js.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"file":"index.js","sourceRoot":"","sources":["../../../src/formatter/index.ts"],"names":[],"mappings":";;;AAAA,+CAA8C;AAArC,4GAAA,YAAY,OAAA;AACrB,mDAAkD;AAAzC,gHAAA,cAAc,OAAA"}
14
doc/test-data/purchase_transaction/node_modules/@fast-csv/format/build/src/index.d.ts
generated
vendored
Normal file
@@ -0,0 +1,14 @@
/// <reference types="node" />
import * as fs from 'fs';
import { Row } from './types';
import { FormatterOptionsArgs } from './FormatterOptions';
import { CsvFormatterStream } from './CsvFormatterStream';
export * from './types';
export { CsvFormatterStream } from './CsvFormatterStream';
export { FormatterOptions, FormatterOptionsArgs } from './FormatterOptions';
export declare const format: <I extends Row, O extends Row>(options?: FormatterOptionsArgs<I, O> | undefined) => CsvFormatterStream<I, O>;
export declare const write: <I extends Row, O extends Row>(rows: I[], options?: FormatterOptionsArgs<I, O> | undefined) => CsvFormatterStream<I, O>;
export declare const writeToStream: <T extends NodeJS.WritableStream, I extends Row, O extends Row>(ws: T, rows: I[], options?: FormatterOptionsArgs<I, O> | undefined) => T;
export declare const writeToBuffer: <I extends Row, O extends Row>(rows: I[], opts?: FormatterOptionsArgs<I, O>) => Promise<Buffer>;
export declare const writeToString: <I extends Row, O extends Row>(rows: I[], options?: FormatterOptionsArgs<I, O> | undefined) => Promise<string>;
export declare const writeToPath: <I extends Row, O extends Row>(path: string, rows: I[], options?: FormatterOptionsArgs<I, O> | undefined) => fs.WriteStream;
68
doc/test-data/purchase_transaction/node_modules/@fast-csv/format/build/src/index.js
generated
vendored
Normal file
@@ -0,0 +1,68 @@
"use strict";
var __createBinding = (this && this.__createBinding) || (Object.create ? (function(o, m, k, k2) {
    if (k2 === undefined) k2 = k;
    Object.defineProperty(o, k2, { enumerable: true, get: function() { return m[k]; } });
}) : (function(o, m, k, k2) {
    if (k2 === undefined) k2 = k;
    o[k2] = m[k];
}));
var __setModuleDefault = (this && this.__setModuleDefault) || (Object.create ? (function(o, v) {
    Object.defineProperty(o, "default", { enumerable: true, value: v });
}) : function(o, v) {
    o["default"] = v;
});
var __importStar = (this && this.__importStar) || function (mod) {
    if (mod && mod.__esModule) return mod;
    var result = {};
    if (mod != null) for (var k in mod) if (k !== "default" && Object.prototype.hasOwnProperty.call(mod, k)) __createBinding(result, mod, k);
    __setModuleDefault(result, mod);
    return result;
};
var __exportStar = (this && this.__exportStar) || function(m, exports) {
    for (var p in m) if (p !== "default" && !Object.prototype.hasOwnProperty.call(exports, p)) __createBinding(exports, m, p);
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.writeToPath = exports.writeToString = exports.writeToBuffer = exports.writeToStream = exports.write = exports.format = exports.FormatterOptions = exports.CsvFormatterStream = void 0;
const util_1 = require("util");
const stream_1 = require("stream");
const fs = __importStar(require("fs"));
const FormatterOptions_1 = require("./FormatterOptions");
const CsvFormatterStream_1 = require("./CsvFormatterStream");
__exportStar(require("./types"), exports);
var CsvFormatterStream_2 = require("./CsvFormatterStream");
Object.defineProperty(exports, "CsvFormatterStream", { enumerable: true, get: function () { return CsvFormatterStream_2.CsvFormatterStream; } });
var FormatterOptions_2 = require("./FormatterOptions");
Object.defineProperty(exports, "FormatterOptions", { enumerable: true, get: function () { return FormatterOptions_2.FormatterOptions; } });
exports.format = (options) => new CsvFormatterStream_1.CsvFormatterStream(new FormatterOptions_1.FormatterOptions(options));
exports.write = (rows, options) => {
    const csvStream = exports.format(options);
    const promiseWrite = util_1.promisify((row, cb) => {
        csvStream.write(row, undefined, cb);
    });
    rows.reduce((prev, row) => prev.then(() => promiseWrite(row)), Promise.resolve())
        .then(() => csvStream.end())
        .catch((err) => {
        csvStream.emit('error', err);
    });
    return csvStream;
};
exports.writeToStream = (ws, rows, options) => exports.write(rows, options).pipe(ws);
exports.writeToBuffer = (rows, opts = {}) => {
    const buffers = [];
    const ws = new stream_1.Writable({
        write(data, enc, writeCb) {
            buffers.push(data);
            writeCb();
        },
    });
    return new Promise((res, rej) => {
        ws.on('error', rej).on('finish', () => res(Buffer.concat(buffers)));
        exports.write(rows, opts).pipe(ws);
    });
};
exports.writeToString = (rows, options) => exports.writeToBuffer(rows, options).then((buffer) => buffer.toString());
exports.writeToPath = (path, rows, options) => {
    const stream = fs.createWriteStream(path, { encoding: 'utf8' });
    return exports.write(rows, options).pipe(stream);
};
//# sourceMappingURL=index.js.map
1
doc/test-data/purchase_transaction/node_modules/@fast-csv/format/build/src/index.js.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"file":"index.js","sourceRoot":"","sources":["../../src/index.ts"],"names":[],"mappings":";;;;;;;;;;;;;;;;;;;;;;;;;AAAA,+BAAiC;AACjC,mCAAkC;AAClC,uCAAyB;AAEzB,yDAA4E;AAC5E,6DAA0D;AAE1D,0CAAwB;AACxB,2DAA0D;AAAjD,wHAAA,kBAAkB,OAAA;AAC3B,uDAA4E;AAAnE,oHAAA,gBAAgB,OAAA;AAEZ,QAAA,MAAM,GAAG,CAA+B,OAAoC,EAA4B,EAAE,CACnH,IAAI,uCAAkB,CAAC,IAAI,mCAAgB,CAAC,OAAO,CAAC,CAAC,CAAC;AAE7C,QAAA,KAAK,GAAG,CACjB,IAAS,EACT,OAAoC,EACZ,EAAE;IAC1B,MAAM,SAAS,GAAG,cAAM,CAAC,OAAO,CAAC,CAAC;IAClC,MAAM,YAAY,GAAG,gBAAS,CAAC,CAAC,GAAM,EAAE,EAAkC,EAAQ,EAAE;QAChF,SAAS,CAAC,KAAK,CAAC,GAAG,EAAE,SAAS,EAAE,EAAE,CAAC,CAAC;IACxC,CAAC,CAAC,CAAC;IACH,IAAI,CAAC,MAAM,CACP,CAAC,IAAmB,EAAE,GAAM,EAAiB,EAAE,CAAC,IAAI,CAAC,IAAI,CAAC,GAAkB,EAAE,CAAC,YAAY,CAAC,GAAG,CAAC,CAAC,EACjG,OAAO,CAAC,OAAO,EAAE,CACpB;SACI,IAAI,CAAC,GAAS,EAAE,CAAC,SAAS,CAAC,GAAG,EAAE,CAAC;SACjC,KAAK,CAAC,CAAC,GAAG,EAAQ,EAAE;QACjB,SAAS,CAAC,IAAI,CAAC,OAAO,EAAE,GAAG,CAAC,CAAC;IACjC,CAAC,CAAC,CAAC;IACP,OAAO,SAAS,CAAC;AACrB,CAAC,CAAC;AAEW,QAAA,aAAa,GAAG,CACzB,EAAK,EACL,IAAS,EACT,OAAoC,EACnC,EAAE,CAAC,aAAK,CAAC,IAAI,EAAE,OAAO,CAAC,CAAC,IAAI,CAAC,EAAE,CAAC,CAAC;AAEzB,QAAA,aAAa,GAAG,CACzB,IAAS,EACT,OAAmC,EAAE,EACtB,EAAE;IACjB,MAAM,OAAO,GAAa,EAAE,CAAC;IAC7B,MAAM,EAAE,GAAG,IAAI,iBAAQ,CAAC;QACpB,KAAK,CAAC,IAAI,EAAE,GAAG,EAAE,OAAO;YACpB,OAAO,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC;YACnB,OAAO,EAAE,CAAC;QACd,CAAC;KACJ,CAAC,CAAC;IACH,OAAO,IAAI,OAAO,CAAC,CAAC,GAAG,EAAE,GAAG,EAAQ,EAAE;QAClC,EAAE,CAAC,EAAE,CAAC,OAAO,EAAE,GAAG,CAAC,CAAC,EAAE,CAAC,QAAQ,EAAE,GAAS,EAAE,CAAC,GAAG,CAAC,MAAM,CAAC,MAAM,CAAC,OAAO,CAAC,CAAC,CAAC,CAAC;QAC1E,aAAK,CAAC,IAAI,EAAE,IAAI,CAAC,CAAC,IAAI,CAAC,EAAE,CAAC,CAAC;IAC/B,CAAC,CAAC,CAAC;AACP,CAAC,CAAC;AAEW,QAAA,aAAa,GAAG,CACzB,IAAS,EACT,OAAoC,EACrB,EAAE,CAAC,qBAAa,CAAC,IAAI,EAAE,OAAO,CAAC,CAAC,IAAI,CAAC,CAAC,MAAM,EAAU,EAAE,CAAC,MAAM,CAAC,QAAQ,EAAE,CAAC,CAAC;AAElF,QAAA,WAAW,GAAG,CACvB,IAAY,EACZ,IAAS,EACT,OAAoC,EACtB,EAAE;IAChB,MAAM,MAAM,GAAG,EAAE,CAAC,iBAAiB,CAAC,IAAI,EAAE,EAAE,QAAQ,EAAE,MAAM,EAAE,CAAC,CAAC;IAChE,OAAO,aAAK,CAAC,IAAI,EAAE,OAAO,CAAC,CAAC,IAAI,CAAC,MAAM,CAAC,CAAC;AAC7C,CAAC,CAAC"}
9
doc/test-data/purchase_transaction/node_modules/@fast-csv/format/build/src/types.d.ts
generated
vendored
Normal file
@@ -0,0 +1,9 @@
export declare type RowMap<V = any> = Record<string, V>;
export declare type RowHashArray<V = any> = [string, V][];
export declare type RowArray = string[];
export declare type Row = RowArray | RowHashArray | RowMap;
export declare type RowTransformCallback<R extends Row> = (error?: Error | null, row?: R) => void;
export declare type SyncRowTransform<I extends Row, O extends Row> = (row: I) => O;
export declare type AsyncRowTransform<I extends Row, O extends Row> = (row: I, cb: RowTransformCallback<O>) => void;
export declare type RowTransformFunction<I extends Row, O extends Row> = SyncRowTransform<I, O> | AsyncRowTransform<I, O>;
export declare const isSyncTransform: <I extends Row, O extends Row>(transform: RowTransformFunction<I, O>) => transform is SyncRowTransform<I, O>;
6
doc/test-data/purchase_transaction/node_modules/@fast-csv/format/build/src/types.js
generated
vendored
Normal file
@@ -0,0 +1,6 @@
"use strict";
/* eslint-disable @typescript-eslint/no-explicit-any */
Object.defineProperty(exports, "__esModule", { value: true });
exports.isSyncTransform = void 0;
exports.isSyncTransform = (transform) => transform.length === 1;
//# sourceMappingURL=types.js.map
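The `isSyncTransform` guard above distinguishes sync from async transforms purely by declared arity, via `Function.length`. A quick illustration (the two transform functions are mine, for demonstration):

```javascript
// Function.length counts a function's declared parameters, so a
// one-argument transform is treated as synchronous and a
// two-argument (row, callback) transform as asynchronous.
const isSyncTransform = (transform) => transform.length === 1;

const syncTransform = (row) => ({ ...row, seen: true }); // 1 param: sync
const asyncTransform = (row, cb) => cb(null, row);       // 2 params: async

console.log(isSyncTransform(syncTransform));  // true
console.log(isSyncTransform(asyncTransform)); // false
```

One caveat of this approach: default or rest parameters are not counted by `Function.length`, so a transform declared as `(row, cb = noop) => …` would be misclassified as synchronous.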
1
doc/test-data/purchase_transaction/node_modules/@fast-csv/format/build/src/types.js.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"file":"types.js","sourceRoot":"","sources":["../../src/types.ts"],"names":[],"mappings":";AAAA,uDAAuD;;;AAY1C,QAAA,eAAe,GAAG,CAC3B,SAAqC,EACF,EAAE,CAAC,SAAS,CAAC,MAAM,KAAK,CAAC,CAAC"}
55
doc/test-data/purchase_transaction/node_modules/@fast-csv/format/package.json
generated
vendored
Normal file
@@ -0,0 +1,55 @@
{
    "name": "@fast-csv/format",
    "version": "4.3.5",
    "description": "fast-csv formatting module",
    "keywords": [
        "csv",
        "format",
        "write"
    ],
    "author": "doug-martin <doug@dougamartin.com>",
    "homepage": "http://c2fo.github.com/fast-csv/packages/format",
    "license": "MIT",
    "main": "build/src/index.js",
    "types": "build/src/index.d.ts",
    "directories": {
        "lib": "src",
        "test": "__tests__"
    },
    "files": [
        "build/src/**"
    ],
    "publishConfig": {
        "access": "public"
    },
    "repository": {
        "type": "git",
        "url": "git+https://github.com/C2FO/fast-csv.git",
        "directory": "packages/format"
    },
    "scripts": {
        "prepublishOnly": "npm run build",
        "build": "npm run clean && npm run compile",
        "clean": "rm -rf ./build && rm -rf tsconfig.tsbuildinfo",
        "compile": "tsc"
    },
    "bugs": {
        "url": "https://github.com/C2FO/fast-csv/issues"
    },
    "dependencies": {
        "@types/node": "^14.0.1",
        "lodash.escaperegexp": "^4.1.2",
        "lodash.isboolean": "^3.0.3",
        "lodash.isequal": "^4.5.0",
        "lodash.isfunction": "^3.0.9",
        "lodash.isnil": "^4.0.0"
    },
    "devDependencies": {
        "@types/lodash.escaperegexp": "4.1.6",
        "@types/lodash.isboolean": "3.0.6",
        "@types/lodash.isequal": "4.5.5",
        "@types/lodash.isfunction": "3.0.6",
        "@types/lodash.isnil": "4.0.6"
    },
    "gitHead": "b908170cb49398ae12847d050af5c8e5b0dc812f"
}
87
doc/test-data/purchase_transaction/node_modules/@fast-csv/parse/CHANGELOG.md
generated
vendored
Normal file
@@ -0,0 +1,87 @@
# Change Log

All notable changes to this project will be documented in this file.
See [Conventional Commits](https://conventionalcommits.org) for commit guidelines.

## [4.3.6](https://github.com/C2FO/fast-csv/compare/v4.3.5...v4.3.6) (2020-12-04)

### Bug Fixes

* Simplify empty row check by removing complex regex ([4bbd39f](https://github.com/C2FO/fast-csv/commit/4bbd39f26a8cd7382151ab4f5fb102234b2f829e))

## [4.3.3](https://github.com/C2FO/fast-csv/compare/v4.3.2...v4.3.3) (2020-10-30)

**Note:** Version bump only for package @fast-csv/parse

## [4.3.2](https://github.com/C2FO/fast-csv/compare/v4.3.1...v4.3.2) (2020-09-02)

### Bug Fixes

* **parsing, #423:** Prevent callback from being called multiple times ([040febe](https://github.com/C2FO/fast-csv/commit/040febe17f5fe763a00f45b1d83c5acd47bbbe0b)), closes [#423](https://github.com/C2FO/fast-csv/issues/423)

## [4.3.1](https://github.com/C2FO/fast-csv/compare/v4.3.0...v4.3.1) (2020-06-23)

### Bug Fixes

* **parsing:** Pass errors through callbacks ([84ecdf6](https://github.com/C2FO/fast-csv/commit/84ecdf6ed18b15d68b4ed3e2bfec7eb41b438ad8))

# [4.3.0](https://github.com/C2FO/fast-csv/compare/v4.2.0...v4.3.0) (2020-05-27)

**Note:** Version bump only for package @fast-csv/parse

# [4.2.0](https://github.com/C2FO/fast-csv/compare/v4.1.6...v4.2.0) (2020-05-19)

### Features

* **parsing:** Less restrictive row parsing type [#356](https://github.com/C2FO/fast-csv/issues/356) ([87d74ec](https://github.com/C2FO/fast-csv/commit/87d74ecd2cb16f3700b1942ebbbec221afe38790))

## [4.1.6](https://github.com/C2FO/fast-csv/compare/v4.1.5...v4.1.6) (2020-05-15)

### Bug Fixes

* **parse:** Handle escaped escape properly [#340](https://github.com/C2FO/fast-csv/issues/340) ([78d9b16](https://github.com/C2FO/fast-csv/commit/78d9b160152ee399f31086cc6b5f66a7ca7f9e24))

## [4.1.5](https://github.com/C2FO/fast-csv/compare/v4.1.4...v4.1.5) (2020-05-15)

**Note:** Version bump only for package @fast-csv/parse

## [4.1.4](https://github.com/C2FO/fast-csv/compare/v4.1.3...v4.1.4) (2020-05-15)

**Note:** Version bump only for package @fast-csv/parse
21
doc/test-data/purchase_transaction/node_modules/@fast-csv/parse/LICENSE
generated
vendored
Normal file
@@ -0,0 +1,21 @@
The MIT License

Copyright (c) 2011-2019 C2FO

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.
20
doc/test-data/purchase_transaction/node_modules/@fast-csv/parse/README.md
generated
vendored
Normal file
@@ -0,0 +1,20 @@
<p align="center">
    <a href="https://c2fo.io/fast-csv" target="blank"><img src="https://c2fo.io/fast-csv/img/logo.svg" width="200" alt="fast-csv Logo" /></a>
</p>

[](https://www.npmjs.org/package/@fast-csv/parse)
[](https://travis-ci.org/C2FO/fast-csv)
[](https://coveralls.io/github/C2FO/fast-csv?branch=master)
[](https://snyk.io/test/github/C2FO/fast-csv?targetFile=packages/parse/package.json)

# `@fast-csv/parse`

`fast-csv` package to parse CSVs.

## Installation

[Install Guide](https://c2fo.io/fast-csv/docs/introduction/install)

## Usage

To get started with `@fast-csv/parse` [check out the docs](https://c2fo.io/fast-csv/docs/parsing/getting-started)
33
doc/test-data/purchase_transaction/node_modules/@fast-csv/parse/build/src/CsvParserStream.d.ts
generated
vendored
Normal file
@@ -0,0 +1,33 @@
/// <reference types="node" />
import { Transform, TransformCallback } from 'stream';
import { ParserOptions } from './ParserOptions';
import { Row, RowTransformFunction, RowValidate } from './types';
export declare class CsvParserStream<I extends Row, O extends Row> extends Transform {
    private readonly parserOptions;
    private readonly decoder;
    private readonly parser;
    private readonly headerTransformer;
    private readonly rowTransformerValidator;
    private lines;
    private rowCount;
    private parsedRowCount;
    private parsedLineCount;
    private endEmitted;
    private headersEmitted;
    constructor(parserOptions: ParserOptions);
    private get hasHitRowLimit();
    private get shouldEmitRows();
    private get shouldSkipLine();
    transform(transformFunction: RowTransformFunction<I, O>): CsvParserStream<I, O>;
    validate(validateFunction: RowValidate<O>): CsvParserStream<I, O>;
    emit(event: string | symbol, ...rest: any[]): boolean;
    _transform(data: Buffer, encoding: string, done: TransformCallback): void;
    _flush(done: TransformCallback): void;
    private parse;
    private processRows;
    private transformRow;
    private checkAndEmitHeaders;
    private skipRow;
    private pushRow;
    private static wrapDoneCallback;
}
212
doc/test-data/purchase_transaction/node_modules/@fast-csv/parse/build/src/CsvParserStream.js
generated
vendored
Normal file
@@ -0,0 +1,212 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.CsvParserStream = void 0;
const string_decoder_1 = require("string_decoder");
const stream_1 = require("stream");
const transforms_1 = require("./transforms");
const parser_1 = require("./parser");
class CsvParserStream extends stream_1.Transform {
    constructor(parserOptions) {
        super({ objectMode: parserOptions.objectMode });
        this.lines = '';
        this.rowCount = 0;
        this.parsedRowCount = 0;
        this.parsedLineCount = 0;
        this.endEmitted = false;
        this.headersEmitted = false;
        this.parserOptions = parserOptions;
        this.parser = new parser_1.Parser(parserOptions);
        this.headerTransformer = new transforms_1.HeaderTransformer(parserOptions);
        this.decoder = new string_decoder_1.StringDecoder(parserOptions.encoding);
        this.rowTransformerValidator = new transforms_1.RowTransformerValidator();
    }
    get hasHitRowLimit() {
        return this.parserOptions.limitRows && this.rowCount >= this.parserOptions.maxRows;
    }
    get shouldEmitRows() {
        return this.parsedRowCount > this.parserOptions.skipRows;
    }
    get shouldSkipLine() {
        return this.parsedLineCount <= this.parserOptions.skipLines;
    }
    transform(transformFunction) {
        this.rowTransformerValidator.rowTransform = transformFunction;
        return this;
    }
    validate(validateFunction) {
        this.rowTransformerValidator.rowValidator = validateFunction;
        return this;
    }
    // eslint-disable-next-line @typescript-eslint/no-explicit-any
    emit(event, ...rest) {
        if (event === 'end') {
            if (!this.endEmitted) {
                this.endEmitted = true;
                super.emit('end', this.rowCount);
            }
            return false;
        }
        return super.emit(event, ...rest);
    }
    _transform(data, encoding, done) {
        // if we have hit our maxRows parsing limit then skip parsing
        if (this.hasHitRowLimit) {
            return done();
        }
        const wrappedCallback = CsvParserStream.wrapDoneCallback(done);
        try {
            const { lines } = this;
            const newLine = lines + this.decoder.write(data);
            const rows = this.parse(newLine, true);
            return this.processRows(rows, wrappedCallback);
        }
        catch (e) {
            return wrappedCallback(e);
        }
    }
    _flush(done) {
        const wrappedCallback = CsvParserStream.wrapDoneCallback(done);
        // if we have hit our maxRows parsing limit then skip parsing
        if (this.hasHitRowLimit) {
            return wrappedCallback();
        }
        try {
            const newLine = this.lines + this.decoder.end();
            const rows = this.parse(newLine, false);
            return this.processRows(rows, wrappedCallback);
        }
        catch (e) {
            return wrappedCallback(e);
        }
    }
    parse(data, hasMoreData) {
        if (!data) {
            return [];
        }
        const { line, rows } = this.parser.parse(data, hasMoreData);
        this.lines = line;
        return rows;
    }
    processRows(rows, cb) {
        const rowsLength = rows.length;
        const iterate = (i) => {
            const callNext = (err) => {
                if (err) {
                    return cb(err);
                }
                if (i % 100 === 0) {
                    // incase the transform are sync insert a next tick to prevent stack overflow
                    setImmediate(() => iterate(i + 1));
                    return undefined;
                }
                return iterate(i + 1);
            };
            this.checkAndEmitHeaders();
            // if we have emitted all rows or we have hit the maxRows limit option
            // then end
            if (i >= rowsLength || this.hasHitRowLimit) {
                return cb();
            }
            this.parsedLineCount += 1;
            if (this.shouldSkipLine) {
                return callNext();
            }
            const row = rows[i];
            this.rowCount += 1;
            this.parsedRowCount += 1;
            const nextRowCount = this.rowCount;
            return this.transformRow(row, (err, transformResult) => {
                if (err) {
                    this.rowCount -= 1;
                    return callNext(err);
                }
                if (!transformResult) {
                    return callNext(new Error('expected transform result'));
                }
                if (!transformResult.isValid) {
                    this.emit('data-invalid', transformResult.row, nextRowCount, transformResult.reason);
                }
                else if (transformResult.row) {
                    return this.pushRow(transformResult.row, callNext);
                }
                return callNext();
            });
        };
        iterate(0);
    }
    transformRow(parsedRow, cb) {
        try {
            this.headerTransformer.transform(parsedRow, (err, withHeaders) => {
                if (err) {
                    return cb(err);
                }
                if (!withHeaders) {
                    return cb(new Error('Expected result from header transform'));
                }
                if (!withHeaders.isValid) {
                    if (this.shouldEmitRows) {
                        return cb(null, { isValid: false, row: parsedRow });
                    }
                    // skipped because of skipRows option remove from total row count
                    return this.skipRow(cb);
                }
                if (withHeaders.row) {
                    if (this.shouldEmitRows) {
                        return this.rowTransformerValidator.transformAndValidate(withHeaders.row, cb);
                    }
                    // skipped because of skipRows option remove from total row count
                    return this.skipRow(cb);
                }
                // this is a header row dont include in the rowCount or parsedRowCount
                this.rowCount -= 1;
                this.parsedRowCount -= 1;
                return cb(null, { row: null, isValid: true });
            });
        }
        catch (e) {
            cb(e);
        }
    }
    checkAndEmitHeaders() {
        if (!this.headersEmitted && this.headerTransformer.headers) {
            this.headersEmitted = true;
            this.emit('headers', this.headerTransformer.headers);
        }
    }
    skipRow(cb) {
        // skipped because of skipRows option remove from total row count
        this.rowCount -= 1;
        return cb(null, { row: null, isValid: true });
    }
    pushRow(row, cb) {
        try {
            if (!this.parserOptions.objectMode) {
                this.push(JSON.stringify(row));
            }
            else {
                this.push(row);
            }
            cb();
        }
        catch (e) {
            cb(e);
        }
    }
    static wrapDoneCallback(done) {
        let errorCalled = false;
        // eslint-disable-next-line @typescript-eslint/no-explicit-any
        return (err, ...args) => {
            if (err) {
                if (errorCalled) {
                    throw err;
                }
                errorCalled = true;
                done(err);
                return;
            }
            done(...args);
        };
    }
}
exports.CsvParserStream = CsvParserStream;
//# sourceMappingURL=CsvParserStream.js.map
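The `iterate`/`callNext` pair in `processRows` above recurses synchronously through rows but defers to `setImmediate` every 100 iterations, so a chain of purely synchronous transforms cannot overflow the call stack. A stripped-down sketch of that pattern (the `forEachRow` helper and row values are mine, not fast-csv API):

```javascript
// Recurse synchronously, but break the call chain with setImmediate
// every 100 rows, bounding synchronous stack depth at ~100 frames.
function forEachRow(rows, onRow, done) {
    const iterate = (i) => {
        if (i >= rows.length) {
            return done();
        }
        onRow(rows[i]);
        if (i % 100 === 0) {
            setImmediate(() => iterate(i + 1)); // yield to the event loop
            return;
        }
        return iterate(i + 1);
    };
    iterate(0);
}

const finished = new Promise((resolve) => {
    let count = 0;
    forEachRow(new Array(100000).fill(null), () => {
        count += 1;
    }, () => resolve(count));
});

finished.then((n) => console.log(n)); // 100000
```

Without the periodic `setImmediate`, 100,000 synchronous recursive calls would exceed the default stack limit; with it, the deepest synchronous chain is only 100 calls.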
1
doc/test-data/purchase_transaction/node_modules/@fast-csv/parse/build/src/CsvParserStream.js.map
generated
vendored
Normal file
File diff suppressed because one or more lines are too long
47
doc/test-data/purchase_transaction/node_modules/@fast-csv/parse/build/src/ParserOptions.d.ts
generated
vendored
Normal file
@@ -0,0 +1,47 @@
/// <reference types="node" />
import { HeaderArray, HeaderTransformFunction } from './types';
export interface ParserOptionsArgs {
    objectMode?: boolean;
    delimiter?: string;
    quote?: string | null;
    escape?: string;
    headers?: boolean | HeaderTransformFunction | HeaderArray;
    renameHeaders?: boolean;
    ignoreEmpty?: boolean;
    comment?: string;
    strictColumnHandling?: boolean;
    discardUnmappedColumns?: boolean;
    trim?: boolean;
    ltrim?: boolean;
    rtrim?: boolean;
    encoding?: string;
    maxRows?: number;
    skipLines?: number;
    skipRows?: number;
}
export declare class ParserOptions {
    readonly escapedDelimiter: string;
    readonly objectMode: boolean;
    readonly delimiter: string;
    readonly ignoreEmpty: boolean;
    readonly quote: string | null;
    readonly escape: string | null;
    readonly escapeChar: string | null;
    readonly comment: string | null;
    readonly supportsComments: boolean;
    readonly ltrim: boolean;
    readonly rtrim: boolean;
    readonly trim: boolean;
    readonly headers: boolean | HeaderTransformFunction | HeaderArray | null;
    readonly renameHeaders: boolean;
    readonly strictColumnHandling: boolean;
    readonly discardUnmappedColumns: boolean;
    readonly carriageReturn: string;
    readonly NEXT_TOKEN_REGEXP: RegExp;
    readonly encoding: BufferEncoding;
    readonly limitRows: boolean;
    readonly maxRows: number;
    readonly skipLines: number;
    readonly skipRows: number;
    constructor(opts?: ParserOptionsArgs);
}
47
doc/test-data/purchase_transaction/node_modules/@fast-csv/parse/build/src/ParserOptions.js
generated
vendored
Normal file
@@ -0,0 +1,47 @@
"use strict";
var __importDefault = (this && this.__importDefault) || function (mod) {
    return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.ParserOptions = void 0;
const lodash_escaperegexp_1 = __importDefault(require("lodash.escaperegexp"));
const lodash_isnil_1 = __importDefault(require("lodash.isnil"));
class ParserOptions {
    constructor(opts) {
        var _a;
        this.objectMode = true;
        this.delimiter = ',';
        this.ignoreEmpty = false;
        this.quote = '"';
        this.escape = null;
        this.escapeChar = this.quote;
        this.comment = null;
        this.supportsComments = false;
        this.ltrim = false;
        this.rtrim = false;
        this.trim = false;
        this.headers = null;
        this.renameHeaders = false;
        this.strictColumnHandling = false;
        this.discardUnmappedColumns = false;
        this.carriageReturn = '\r';
        this.encoding = 'utf8';
        this.limitRows = false;
        this.maxRows = 0;
        this.skipLines = 0;
        this.skipRows = 0;
        Object.assign(this, opts || {});
        if (this.delimiter.length > 1) {
            throw new Error('delimiter option must be one character long');
        }
        this.escapedDelimiter = lodash_escaperegexp_1.default(this.delimiter);
        this.escapeChar = (_a = this.escape) !== null && _a !== void 0 ? _a : this.quote;
        this.supportsComments = !lodash_isnil_1.default(this.comment);
        this.NEXT_TOKEN_REGEXP = new RegExp(`([^\\s]|\\r\\n|\\n|\\r|${this.escapedDelimiter})`);
        if (this.maxRows > 0) {
            this.limitRows = true;
        }
    }
}
exports.ParserOptions = ParserOptions;
//# sourceMappingURL=ParserOptions.js.map
1
doc/test-data/purchase_transaction/node_modules/@fast-csv/parse/build/src/ParserOptions.js.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"file":"ParserOptions.js","sourceRoot":"","sources":["../../src/ParserOptions.ts"],"names":[],"mappings":";;;;;;AAAA,8EAA+C;AAC/C,gEAAiC;AAuBjC,MAAa,aAAa;IA+CtB,YAAmB,IAAwB;;QA5C3B,eAAU,GAAY,IAAI,CAAC;QAE3B,cAAS,GAAW,GAAG,CAAC;QAExB,gBAAW,GAAY,KAAK,CAAC;QAE7B,UAAK,GAAkB,GAAG,CAAC;QAE3B,WAAM,GAAkB,IAAI,CAAC;QAE7B,eAAU,GAAkB,IAAI,CAAC,KAAK,CAAC;QAEvC,YAAO,GAAkB,IAAI,CAAC;QAE9B,qBAAgB,GAAY,KAAK,CAAC;QAElC,UAAK,GAAY,KAAK,CAAC;QAEvB,UAAK,GAAY,KAAK,CAAC;QAEvB,SAAI,GAAY,KAAK,CAAC;QAEtB,YAAO,GAA2D,IAAI,CAAC;QAEvE,kBAAa,GAAY,KAAK,CAAC;QAE/B,yBAAoB,GAAY,KAAK,CAAC;QAEtC,2BAAsB,GAAY,KAAK,CAAC;QAExC,mBAAc,GAAW,IAAI,CAAC;QAI9B,aAAQ,GAAmB,MAAM,CAAC;QAElC,cAAS,GAAY,KAAK,CAAC;QAE3B,YAAO,GAAW,CAAC,CAAC;QAEpB,cAAS,GAAW,CAAC,CAAC;QAEtB,aAAQ,GAAW,CAAC,CAAC;QAGjC,MAAM,CAAC,MAAM,CAAC,IAAI,EAAE,IAAI,IAAI,EAAE,CAAC,CAAC;QAChC,IAAI,IAAI,CAAC,SAAS,CAAC,MAAM,GAAG,CAAC,EAAE;YAC3B,MAAM,IAAI,KAAK,CAAC,6CAA6C,CAAC,CAAC;SAClE;QACD,IAAI,CAAC,gBAAgB,GAAG,6BAAY,CAAC,IAAI,CAAC,SAAS,CAAC,CAAC;QACrD,IAAI,CAAC,UAAU,SAAG,IAAI,CAAC,MAAM,mCAAI,IAAI,CAAC,KAAK,CAAC;QAC5C,IAAI,CAAC,gBAAgB,GAAG,CAAC,sBAAK,CAAC,IAAI,CAAC,OAAO,CAAC,CAAC;QAC7C,IAAI,CAAC,iBAAiB,GAAG,IAAI,MAAM,CAAC,0BAA0B,IAAI,CAAC,gBAAgB,GAAG,CAAC,CAAC;QAExF,IAAI,IAAI,CAAC,OAAO,GAAG,CAAC,EAAE;YAClB,IAAI,CAAC,SAAS,GAAG,IAAI,CAAC;SACzB;IACL,CAAC;CACJ;AA7DD,sCA6DC"}
11
doc/test-data/purchase_transaction/node_modules/@fast-csv/parse/build/src/index.d.ts
generated
vendored
Normal file
@@ -0,0 +1,11 @@
/// <reference types="node" />
import { ParserOptionsArgs } from './ParserOptions';
import { CsvParserStream } from './CsvParserStream';
import { Row } from './types';
export * from './types';
export { CsvParserStream } from './CsvParserStream';
export { ParserOptions, ParserOptionsArgs } from './ParserOptions';
export declare const parse: <I extends Row<any>, O extends Row<any>>(args?: ParserOptionsArgs | undefined) => CsvParserStream<I, O>;
export declare const parseStream: <I extends Row<any>, O extends Row<any>>(stream: NodeJS.ReadableStream, options?: ParserOptionsArgs | undefined) => CsvParserStream<I, O>;
export declare const parseFile: <I extends Row<any>, O extends Row<any>>(location: string, options?: ParserOptionsArgs) => CsvParserStream<I, O>;
export declare const parseString: <I extends Row<any>, O extends Row<any>>(string: string, options?: ParserOptionsArgs | undefined) => CsvParserStream<I, O>;
44
doc/test-data/purchase_transaction/node_modules/@fast-csv/parse/build/src/index.js
generated
vendored
Normal file
@@ -0,0 +1,44 @@
"use strict";
var __createBinding = (this && this.__createBinding) || (Object.create ? (function(o, m, k, k2) {
    if (k2 === undefined) k2 = k;
    Object.defineProperty(o, k2, { enumerable: true, get: function() { return m[k]; } });
}) : (function(o, m, k, k2) {
    if (k2 === undefined) k2 = k;
    o[k2] = m[k];
}));
var __setModuleDefault = (this && this.__setModuleDefault) || (Object.create ? (function(o, v) {
    Object.defineProperty(o, "default", { enumerable: true, value: v });
}) : function(o, v) {
    o["default"] = v;
});
var __importStar = (this && this.__importStar) || function (mod) {
    if (mod && mod.__esModule) return mod;
    var result = {};
    if (mod != null) for (var k in mod) if (k !== "default" && Object.prototype.hasOwnProperty.call(mod, k)) __createBinding(result, mod, k);
    __setModuleDefault(result, mod);
    return result;
};
var __exportStar = (this && this.__exportStar) || function(m, exports) {
    for (var p in m) if (p !== "default" && !Object.prototype.hasOwnProperty.call(exports, p)) __createBinding(exports, m, p);
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.parseString = exports.parseFile = exports.parseStream = exports.parse = exports.ParserOptions = exports.CsvParserStream = void 0;
const fs = __importStar(require("fs"));
const stream_1 = require("stream");
const ParserOptions_1 = require("./ParserOptions");
const CsvParserStream_1 = require("./CsvParserStream");
__exportStar(require("./types"), exports);
var CsvParserStream_2 = require("./CsvParserStream");
Object.defineProperty(exports, "CsvParserStream", { enumerable: true, get: function () { return CsvParserStream_2.CsvParserStream; } });
var ParserOptions_2 = require("./ParserOptions");
Object.defineProperty(exports, "ParserOptions", { enumerable: true, get: function () { return ParserOptions_2.ParserOptions; } });
exports.parse = (args) => new CsvParserStream_1.CsvParserStream(new ParserOptions_1.ParserOptions(args));
exports.parseStream = (stream, options) => stream.pipe(new CsvParserStream_1.CsvParserStream(new ParserOptions_1.ParserOptions(options)));
exports.parseFile = (location, options = {}) => fs.createReadStream(location).pipe(new CsvParserStream_1.CsvParserStream(new ParserOptions_1.ParserOptions(options)));
exports.parseString = (string, options) => {
    const rs = new stream_1.Readable();
    rs.push(string);
    rs.push(null);
    return rs.pipe(new CsvParserStream_1.CsvParserStream(new ParserOptions_1.ParserOptions(options)));
};
//# sourceMappingURL=index.js.map
1
doc/test-data/purchase_transaction/node_modules/@fast-csv/parse/build/src/index.js.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"file":"index.js","sourceRoot":"","sources":["../../src/index.ts"],"names":[],"mappings":";;;;;;;;;;;;;;;;;;;;;;;;;AAAA,uCAAyB;AACzB,mCAAkC;AAClC,mDAAmE;AACnE,uDAAoD;AAGpD,0CAAwB;AACxB,qDAAoD;AAA3C,kHAAA,eAAe,OAAA;AACxB,iDAAmE;AAA1D,8GAAA,aAAa,OAAA;AAET,QAAA,KAAK,GAAG,CAA+B,IAAwB,EAAyB,EAAE,CACnG,IAAI,iCAAe,CAAC,IAAI,6BAAa,CAAC,IAAI,CAAC,CAAC,CAAC;AAEpC,QAAA,WAAW,GAAG,CACvB,MAA6B,EAC7B,OAA2B,EACN,EAAE,CAAC,MAAM,CAAC,IAAI,CAAC,IAAI,iCAAe,CAAC,IAAI,6BAAa,CAAC,OAAO,CAAC,CAAC,CAAC,CAAC;AAE5E,QAAA,SAAS,GAAG,CACrB,QAAgB,EAChB,UAA6B,EAAE,EACV,EAAE,CAAC,EAAE,CAAC,gBAAgB,CAAC,QAAQ,CAAC,CAAC,IAAI,CAAC,IAAI,iCAAe,CAAC,IAAI,6BAAa,CAAC,OAAO,CAAC,CAAC,CAAC,CAAC;AAEnG,QAAA,WAAW,GAAG,CACvB,MAAc,EACd,OAA2B,EACN,EAAE;IACvB,MAAM,EAAE,GAAG,IAAI,iBAAQ,EAAE,CAAC;IAC1B,EAAE,CAAC,IAAI,CAAC,MAAM,CAAC,CAAC;IAChB,EAAE,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC;IACd,OAAO,EAAE,CAAC,IAAI,CAAC,IAAI,iCAAe,CAAC,IAAI,6BAAa,CAAC,OAAO,CAAC,CAAC,CAAC,CAAC;AACpE,CAAC,CAAC"}
15  doc/test-data/purchase_transaction/node_modules/@fast-csv/parse/build/src/parser/Parser.d.ts  generated  vendored  Normal file
@@ -0,0 +1,15 @@
import { ParserOptions } from '../ParserOptions';
export interface ParseResult {
    line: string;
    rows: string[][];
}
export declare class Parser {
    private static removeBOM;
    private readonly parserOptions;
    private readonly rowParser;
    constructor(parserOptions: ParserOptions);
    parse(line: string, hasMoreData: boolean): ParseResult;
    private parseWithoutComments;
    private parseWithComments;
    private parseRow;
}
76  doc/test-data/purchase_transaction/node_modules/@fast-csv/parse/build/src/parser/Parser.js  generated  vendored  Normal file
@@ -0,0 +1,76 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.Parser = void 0;
const Scanner_1 = require("./Scanner");
const RowParser_1 = require("./RowParser");
const Token_1 = require("./Token");
class Parser {
    constructor(parserOptions) {
        this.parserOptions = parserOptions;
        this.rowParser = new RowParser_1.RowParser(this.parserOptions);
    }
    static removeBOM(line) {
        // Catches EFBBBF (UTF-8 BOM) because the buffer-to-string
        // conversion translates it to FEFF (UTF-16 BOM)
        if (line && line.charCodeAt(0) === 0xfeff) {
            return line.slice(1);
        }
        return line;
    }
    parse(line, hasMoreData) {
        const scanner = new Scanner_1.Scanner({
            line: Parser.removeBOM(line),
            parserOptions: this.parserOptions,
            hasMoreData,
        });
        if (this.parserOptions.supportsComments) {
            return this.parseWithComments(scanner);
        }
        return this.parseWithoutComments(scanner);
    }
    parseWithoutComments(scanner) {
        const rows = [];
        let shouldContinue = true;
        while (shouldContinue) {
            shouldContinue = this.parseRow(scanner, rows);
        }
        return { line: scanner.line, rows };
    }
    parseWithComments(scanner) {
        const { parserOptions } = this;
        const rows = [];
        for (let nextToken = scanner.nextCharacterToken; nextToken !== null; nextToken = scanner.nextCharacterToken) {
            if (Token_1.Token.isTokenComment(nextToken, parserOptions)) {
                const cursor = scanner.advancePastLine();
                if (cursor === null) {
                    return { line: scanner.lineFromCursor, rows };
                }
                if (!scanner.hasMoreCharacters) {
                    return { line: scanner.lineFromCursor, rows };
                }
                scanner.truncateToCursor();
            }
            else if (!this.parseRow(scanner, rows)) {
                break;
            }
        }
        return { line: scanner.line, rows };
    }
    parseRow(scanner, rows) {
        const nextToken = scanner.nextNonSpaceToken;
        if (!nextToken) {
            return false;
        }
        const row = this.rowParser.parse(scanner);
        if (row === null) {
            return false;
        }
        if (this.parserOptions.ignoreEmpty && RowParser_1.RowParser.isEmptyRow(row)) {
            return true;
        }
        rows.push(row);
        return true;
    }
}
exports.Parser = Parser;
//# sourceMappingURL=Parser.js.map
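`Parser.removeBOM` above strips a UTF-8 BOM, which surfaces as a single U+FEFF code unit after buffer-to-string conversion. The same check in isolation (a standalone sketch, not the library's export):

```javascript
// Standalone copy of the BOM check: a UTF-8 BOM (EF BB BF) decodes to U+FEFF,
// so testing the first code unit of the decoded string is enough.
function removeBOM(line) {
  if (line && line.charCodeAt(0) === 0xfeff) {
    return line.slice(1);
  }
  return line;
}

console.log(removeBOM('\ufeffid,name')); // 'id,name'
console.log(removeBOM('id,name'));       // unchanged
```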
1  doc/test-data/purchase_transaction/node_modules/@fast-csv/parse/build/src/parser/Parser.js.map  generated  vendored  Normal file
@@ -0,0 +1 @@
{"version":3,"file":"Parser.js","sourceRoot":"","sources":["../../../src/parser/Parser.ts"],"names":[],"mappings":";;;AAAA,uCAAoC;AACpC,2CAAwC;AAGxC,mCAAgC;AAMhC,MAAa,MAAM;IAcf,YAAmB,aAA4B;QAC3C,IAAI,CAAC,aAAa,GAAG,aAAa,CAAC;QACnC,IAAI,CAAC,SAAS,GAAG,IAAI,qBAAS,CAAC,IAAI,CAAC,aAAa,CAAC,CAAC;IACvD,CAAC;IAhBO,MAAM,CAAC,SAAS,CAAC,IAAY;QACjC,0DAA0D;QAC1D,gDAAgD;QAChD,IAAI,IAAI,IAAI,IAAI,CAAC,UAAU,CAAC,CAAC,CAAC,KAAK,MAAM,EAAE;YACvC,OAAO,IAAI,CAAC,KAAK,CAAC,CAAC,CAAC,CAAC;SACxB;QACD,OAAO,IAAI,CAAC;IAChB,CAAC;IAWM,KAAK,CAAC,IAAY,EAAE,WAAoB;QAC3C,MAAM,OAAO,GAAG,IAAI,iBAAO,CAAC;YACxB,IAAI,EAAE,MAAM,CAAC,SAAS,CAAC,IAAI,CAAC;YAC5B,aAAa,EAAE,IAAI,CAAC,aAAa;YACjC,WAAW;SACd,CAAC,CAAC;QACH,IAAI,IAAI,CAAC,aAAa,CAAC,gBAAgB,EAAE;YACrC,OAAO,IAAI,CAAC,iBAAiB,CAAC,OAAO,CAAC,CAAC;SAC1C;QACD,OAAO,IAAI,CAAC,oBAAoB,CAAC,OAAO,CAAC,CAAC;IAC9C,CAAC;IAEO,oBAAoB,CAAC,OAAgB;QACzC,MAAM,IAAI,GAAe,EAAE,CAAC;QAC5B,IAAI,cAAc,GAAG,IAAI,CAAC;QAC1B,OAAO,cAAc,EAAE;YACnB,cAAc,GAAG,IAAI,CAAC,QAAQ,CAAC,OAAO,EAAE,IAAI,CAAC,CAAC;SACjD;QACD,OAAO,EAAE,IAAI,EAAE,OAAO,CAAC,IAAI,EAAE,IAAI,EAAE,CAAC;IACxC,CAAC;IAEO,iBAAiB,CAAC,OAAgB;QACtC,MAAM,EAAE,aAAa,EAAE,GAAG,IAAI,CAAC;QAC/B,MAAM,IAAI,GAAe,EAAE,CAAC;QAC5B,KAAK,IAAI,SAAS,GAAG,OAAO,CAAC,kBAAkB,EAAE,SAAS,KAAK,IAAI,EAAE,SAAS,GAAG,OAAO,CAAC,kBAAkB,EAAE;YACzG,IAAI,aAAK,CAAC,cAAc,CAAC,SAAS,EAAE,aAAa,CAAC,EAAE;gBAChD,MAAM,MAAM,GAAG,OAAO,CAAC,eAAe,EAAE,CAAC;gBACzC,IAAI,MAAM,KAAK,IAAI,EAAE;oBACjB,OAAO,EAAE,IAAI,EAAE,OAAO,CAAC,cAAc,EAAE,IAAI,EAAE,CAAC;iBACjD;gBACD,IAAI,CAAC,OAAO,CAAC,iBAAiB,EAAE;oBAC5B,OAAO,EAAE,IAAI,EAAE,OAAO,CAAC,cAAc,EAAE,IAAI,EAAE,CAAC;iBACjD;gBACD,OAAO,CAAC,gBAAgB,EAAE,CAAC;aAC9B;iBAAM,IAAI,CAAC,IAAI,CAAC,QAAQ,CAAC,OAAO,EAAE,IAAI,CAAC,EAAE;gBACtC,MAAM;aACT;SACJ;QACD,OAAO,EAAE,IAAI,EAAE,OAAO,CAAC,IAAI,EAAE,IAAI,EAAE,CAAC;IACxC,CAAC;IAEO,QAAQ,CAAC,OAAgB,EAAE,IAAgB;QAC/C,MAAM,SAAS,GAAG,OAAO,CAAC,iBAAiB,CAAC;QAC5C,IAAI,CAAC,SAAS,EAAE;YACZ,OAAO,KAAK,CAAC;SAChB;QACD,MAAM,GAAG,GAAG,IAAI,CAAC,SAAS,CAAC,KAAK,CAAC,OAAO,CAAC,CAAC;QAC1C,IAAI,GAAG,KAAK,IAAI,EAAE;
YACd,OAAO,KAAK,CAAC;SAChB;QACD,IAAI,IAAI,CAAC,aAAa,CAAC,WAAW,IAAI,qBAAS,CAAC,UAAU,CAAC,GAAG,CAAC,EAAE;YAC7D,OAAO,IAAI,CAAC;SACf;QACD,IAAI,CAAC,IAAI,CAAC,GAAG,CAAC,CAAC;QACf,OAAO,IAAI,CAAC;IAChB,CAAC;CACJ;AA3ED,wBA2EC"}
12  doc/test-data/purchase_transaction/node_modules/@fast-csv/parse/build/src/parser/RowParser.d.ts  generated  vendored  Normal file
@@ -0,0 +1,12 @@
import { Scanner } from './Scanner';
import { ParserOptions } from '../ParserOptions';
import { RowArray } from '../types';
export declare class RowParser {
    static isEmptyRow(row: RowArray): boolean;
    private readonly parserOptions;
    private readonly columnParser;
    constructor(parserOptions: ParserOptions);
    parse(scanner: Scanner): RowArray | null;
    private getStartToken;
    private shouldSkipColumnParse;
}
76  doc/test-data/purchase_transaction/node_modules/@fast-csv/parse/build/src/parser/RowParser.js  generated  vendored  Normal file
@@ -0,0 +1,76 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.RowParser = void 0;
const column_1 = require("./column");
const Token_1 = require("./Token");
const EMPTY_STRING = '';
class RowParser {
    constructor(parserOptions) {
        this.parserOptions = parserOptions;
        this.columnParser = new column_1.ColumnParser(parserOptions);
    }
    static isEmptyRow(row) {
        return row.join(EMPTY_STRING).replace(/\s+/g, EMPTY_STRING) === EMPTY_STRING;
    }
    parse(scanner) {
        const { parserOptions } = this;
        const { hasMoreData } = scanner;
        const currentScanner = scanner;
        const columns = [];
        let currentToken = this.getStartToken(currentScanner, columns);
        while (currentToken) {
            if (Token_1.Token.isTokenRowDelimiter(currentToken)) {
                currentScanner.advancePastToken(currentToken);
                // if ends with CR and there is more data, keep unparsed due to possible
                // coming LF in CRLF
                if (!currentScanner.hasMoreCharacters &&
                    Token_1.Token.isTokenCarriageReturn(currentToken, parserOptions) &&
                    hasMoreData) {
                    return null;
                }
                currentScanner.truncateToCursor();
                return columns;
            }
            if (!this.shouldSkipColumnParse(currentScanner, currentToken, columns)) {
                const item = this.columnParser.parse(currentScanner);
                if (item === null) {
                    return null;
                }
                columns.push(item);
            }
            currentToken = currentScanner.nextNonSpaceToken;
        }
        if (!hasMoreData) {
            currentScanner.truncateToCursor();
            return columns;
        }
        return null;
    }
    getStartToken(scanner, columns) {
        const currentToken = scanner.nextNonSpaceToken;
        if (currentToken !== null && Token_1.Token.isTokenDelimiter(currentToken, this.parserOptions)) {
            columns.push('');
            return scanner.nextNonSpaceToken;
        }
        return currentToken;
    }
    shouldSkipColumnParse(scanner, currentToken, columns) {
        const { parserOptions } = this;
        if (Token_1.Token.isTokenDelimiter(currentToken, parserOptions)) {
            scanner.advancePastToken(currentToken);
            // if the delimiter is at the end of a line
            const nextToken = scanner.nextCharacterToken;
            if (!scanner.hasMoreCharacters || (nextToken !== null && Token_1.Token.isTokenRowDelimiter(nextToken))) {
                columns.push('');
                return true;
            }
            if (nextToken !== null && Token_1.Token.isTokenDelimiter(nextToken, parserOptions)) {
                columns.push('');
                return true;
            }
        }
        return false;
    }
}
exports.RowParser = RowParser;
//# sourceMappingURL=RowParser.js.map
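`RowParser.isEmptyRow` above treats a row as empty when joining all cells and stripping whitespace leaves nothing, which is what the `ignoreEmpty` option relies on. The check on its own (a sketch with the constant inlined):

```javascript
// A row is "empty" when all of its cells are blank or whitespace-only.
function isEmptyRow(row) {
  return row.join('').replace(/\s+/g, '') === '';
}

console.log(isEmptyRow(['', '  ', '\t'])); // true
console.log(isEmptyRow(['', 'x']));        // false
```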
1  doc/test-data/purchase_transaction/node_modules/@fast-csv/parse/build/src/parser/RowParser.js.map  generated  vendored  Normal file
@@ -0,0 +1 @@
{"version":3,"file":"RowParser.js","sourceRoot":"","sources":["../../../src/parser/RowParser.ts"],"names":[],"mappings":";;;AACA,qCAAwC;AAGxC,mCAA4C;AAE5C,MAAM,YAAY,GAAG,EAAE,CAAC;AAExB,MAAa,SAAS;IASlB,YAAmB,aAA4B;QAC3C,IAAI,CAAC,aAAa,GAAG,aAAa,CAAC;QACnC,IAAI,CAAC,YAAY,GAAG,IAAI,qBAAY,CAAC,aAAa,CAAC,CAAC;IACxD,CAAC;IAXD,MAAM,CAAC,UAAU,CAAC,GAAa;QAC3B,OAAO,GAAG,CAAC,IAAI,CAAC,YAAY,CAAC,CAAC,OAAO,CAAC,MAAM,EAAE,YAAY,CAAC,KAAK,YAAY,CAAC;IACjF,CAAC;IAWM,KAAK,CAAC,OAAgB;QACzB,MAAM,EAAE,aAAa,EAAE,GAAG,IAAI,CAAC;QAC/B,MAAM,EAAE,WAAW,EAAE,GAAG,OAAO,CAAC;QAChC,MAAM,cAAc,GAAG,OAAO,CAAC;QAC/B,MAAM,OAAO,GAAqB,EAAE,CAAC;QACrC,IAAI,YAAY,GAAG,IAAI,CAAC,aAAa,CAAC,cAAc,EAAE,OAAO,CAAC,CAAC;QAC/D,OAAO,YAAY,EAAE;YACjB,IAAI,aAAK,CAAC,mBAAmB,CAAC,YAAY,CAAC,EAAE;gBACzC,cAAc,CAAC,gBAAgB,CAAC,YAAY,CAAC,CAAC;gBAC9C,wEAAwE;gBACxE,oBAAoB;gBACpB,IACI,CAAC,cAAc,CAAC,iBAAiB;oBACjC,aAAK,CAAC,qBAAqB,CAAC,YAAY,EAAE,aAAa,CAAC;oBACxD,WAAW,EACb;oBACE,OAAO,IAAI,CAAC;iBACf;gBACD,cAAc,CAAC,gBAAgB,EAAE,CAAC;gBAClC,OAAO,OAAO,CAAC;aAClB;YACD,IAAI,CAAC,IAAI,CAAC,qBAAqB,CAAC,cAAc,EAAE,YAAY,EAAE,OAAO,CAAC,EAAE;gBACpE,MAAM,IAAI,GAAG,IAAI,CAAC,YAAY,CAAC,KAAK,CAAC,cAAc,CAAC,CAAC;gBACrD,IAAI,IAAI,KAAK,IAAI,EAAE;oBACf,OAAO,IAAI,CAAC;iBACf;gBACD,OAAO,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC;aACtB;YACD,YAAY,GAAG,cAAc,CAAC,iBAAiB,CAAC;SACnD;QACD,IAAI,CAAC,WAAW,EAAE;YACd,cAAc,CAAC,gBAAgB,EAAE,CAAC;YAClC,OAAO,OAAO,CAAC;SAClB;QACD,OAAO,IAAI,CAAC;IAChB,CAAC;IAEO,aAAa,CAAC,OAAgB,EAAE,OAAiB;QACrD,MAAM,YAAY,GAAG,OAAO,CAAC,iBAAiB,CAAC;QAC/C,IAAI,YAAY,KAAK,IAAI,IAAI,aAAK,CAAC,gBAAgB,CAAC,YAAY,EAAE,IAAI,CAAC,aAAa,CAAC,EAAE;YACnF,OAAO,CAAC,IAAI,CAAC,EAAE,CAAC,CAAC;YACjB,OAAO,OAAO,CAAC,iBAAiB,CAAC;SACpC;QACD,OAAO,YAAY,CAAC;IACxB,CAAC;IAEO,qBAAqB,CAAC,OAAgB,EAAE,YAAmB,EAAE,OAAiB;QAClF,MAAM,EAAE,aAAa,EAAE,GAAG,IAAI,CAAC;QAC/B,IAAI,aAAK,CAAC,gBAAgB,CAAC,YAAY,EAAE,aAAa,CAAC,EAAE;YACrD,OAAO,CAAC,gBAAgB,CAAC,YAAY,CAAC,CAAC;YACvC,2CAA2C;YAC3C,MAAM,SAAS,GAAG,OAAO,CAAC,kBAAkB,CAAC;YAC7C,IAAI,CAAC,OAAO,CAAC,iBAAiB,IAAI,CAAC,SAAS,KAAK,IAAI,IAAI,aAAK,C
AAC,mBAAmB,CAAC,SAAS,CAAC,CAAC,EAAE;gBAC5F,OAAO,CAAC,IAAI,CAAC,EAAE,CAAC,CAAC;gBACjB,OAAO,IAAI,CAAC;aACf;YACD,IAAI,SAAS,KAAK,IAAI,IAAI,aAAK,CAAC,gBAAgB,CAAC,SAAS,EAAE,aAAa,CAAC,EAAE;gBACxE,OAAO,CAAC,IAAI,CAAC,EAAE,CAAC,CAAC;gBACjB,OAAO,IAAI,CAAC;aACf;SACJ;QACD,OAAO,KAAK,CAAC;IACjB,CAAC;CACJ;AA7ED,8BA6EC"}
25  doc/test-data/purchase_transaction/node_modules/@fast-csv/parse/build/src/parser/Scanner.d.ts  generated  vendored  Normal file
@@ -0,0 +1,25 @@
import { ParserOptions } from '../ParserOptions';
import { MaybeToken, Token } from './Token';
export interface ScannerArgs {
    line: string;
    parserOptions: ParserOptions;
    hasMoreData: boolean;
    cursor?: number;
}
export declare class Scanner {
    line: string;
    private readonly parserOptions;
    lineLength: number;
    readonly hasMoreData: boolean;
    cursor: number;
    constructor(args: ScannerArgs);
    get hasMoreCharacters(): boolean;
    get nextNonSpaceToken(): MaybeToken;
    get nextCharacterToken(): MaybeToken;
    get lineFromCursor(): string;
    advancePastLine(): Scanner | null;
    advanceTo(cursor: number): Scanner;
    advanceToToken(token: Token): Scanner;
    advancePastToken(token: Token): Scanner;
    truncateToCursor(): Scanner;
}
82  doc/test-data/purchase_transaction/node_modules/@fast-csv/parse/build/src/parser/Scanner.js  generated  vendored  Normal file
@@ -0,0 +1,82 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.Scanner = void 0;
const Token_1 = require("./Token");
const ROW_DELIMITER = /((?:\r\n)|\n|\r)/;
class Scanner {
    constructor(args) {
        this.cursor = 0;
        this.line = args.line;
        this.lineLength = this.line.length;
        this.parserOptions = args.parserOptions;
        this.hasMoreData = args.hasMoreData;
        this.cursor = args.cursor || 0;
    }
    get hasMoreCharacters() {
        return this.lineLength > this.cursor;
    }
    get nextNonSpaceToken() {
        const { lineFromCursor } = this;
        const regex = this.parserOptions.NEXT_TOKEN_REGEXP;
        if (lineFromCursor.search(regex) === -1) {
            return null;
        }
        const match = regex.exec(lineFromCursor);
        if (match == null) {
            return null;
        }
        const token = match[1];
        const startCursor = this.cursor + (match.index || 0);
        return new Token_1.Token({
            token,
            startCursor,
            endCursor: startCursor + token.length - 1,
        });
    }
    get nextCharacterToken() {
        const { cursor, lineLength } = this;
        if (lineLength <= cursor) {
            return null;
        }
        return new Token_1.Token({
            token: this.line[cursor],
            startCursor: cursor,
            endCursor: cursor,
        });
    }
    get lineFromCursor() {
        return this.line.substr(this.cursor);
    }
    advancePastLine() {
        const match = ROW_DELIMITER.exec(this.lineFromCursor);
        if (!match) {
            if (this.hasMoreData) {
                return null;
            }
            this.cursor = this.lineLength;
            return this;
        }
        this.cursor += (match.index || 0) + match[0].length;
        return this;
    }
    advanceTo(cursor) {
        this.cursor = cursor;
        return this;
    }
    advanceToToken(token) {
        this.cursor = token.startCursor;
        return this;
    }
    advancePastToken(token) {
        this.cursor = token.endCursor + 1;
        return this;
    }
    truncateToCursor() {
        this.line = this.lineFromCursor;
        this.lineLength = this.line.length;
        this.cursor = 0;
        return this;
    }
}
exports.Scanner = Scanner;
//# sourceMappingURL=Scanner.js.map
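The `Scanner` above is essentially a cursor over the unparsed tail of the input; `advancePastLine` moves the cursor just past the next row delimiter (CRLF, LF, or CR). A reduced, function-style sketch of that cursor movement (the `advancePastLine` function here is a simplification that returns only the new cursor position):

```javascript
// Same delimiter pattern as Scanner's ROW_DELIMITER: CRLF first, then LF, then CR.
const ROW_DELIMITER = /((?:\r\n)|\n|\r)/;

// Advance a cursor past the next row delimiter; with none left, jump to the end.
function advancePastLine(line, cursor) {
  const match = ROW_DELIMITER.exec(line.slice(cursor));
  if (!match) return line.length;
  return cursor + match.index + match[0].length;
}

let cursor = 0;
cursor = advancePastLine('a,b\r\nc,d\n', cursor);
console.log(cursor); // 5 — just past the CRLF
cursor = advancePastLine('a,b\r\nc,d\n', cursor);
console.log(cursor); // 9 — end of input
```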
1  doc/test-data/purchase_transaction/node_modules/@fast-csv/parse/build/src/parser/Scanner.js.map  generated  vendored  Normal file
@@ -0,0 +1 @@
{"version":3,"file":"Scanner.js","sourceRoot":"","sources":["../../../src/parser/Scanner.ts"],"names":[],"mappings":";;;AACA,mCAA4C;AAE5C,MAAM,aAAa,GAAG,kBAAkB,CAAC;AASzC,MAAa,OAAO;IAWhB,YAAmB,IAAiB;QAF7B,WAAM,GAAG,CAAC,CAAC;QAGd,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC,IAAI,CAAC;QACtB,IAAI,CAAC,UAAU,GAAG,IAAI,CAAC,IAAI,CAAC,MAAM,CAAC;QACnC,IAAI,CAAC,aAAa,GAAG,IAAI,CAAC,aAAa,CAAC;QACxC,IAAI,CAAC,WAAW,GAAG,IAAI,CAAC,WAAW,CAAC;QACpC,IAAI,CAAC,MAAM,GAAG,IAAI,CAAC,MAAM,IAAI,CAAC,CAAC;IACnC,CAAC;IAED,IAAW,iBAAiB;QACxB,OAAO,IAAI,CAAC,UAAU,GAAG,IAAI,CAAC,MAAM,CAAC;IACzC,CAAC;IAED,IAAW,iBAAiB;QACxB,MAAM,EAAE,cAAc,EAAE,GAAG,IAAI,CAAC;QAChC,MAAM,KAAK,GAAG,IAAI,CAAC,aAAa,CAAC,iBAAiB,CAAC;QACnD,IAAI,cAAc,CAAC,MAAM,CAAC,KAAK,CAAC,KAAK,CAAC,CAAC,EAAE;YACrC,OAAO,IAAI,CAAC;SACf;QACD,MAAM,KAAK,GAAG,KAAK,CAAC,IAAI,CAAC,cAAc,CAAC,CAAC;QACzC,IAAI,KAAK,IAAI,IAAI,EAAE;YACf,OAAO,IAAI,CAAC;SACf;QACD,MAAM,KAAK,GAAG,KAAK,CAAC,CAAC,CAAC,CAAC;QACvB,MAAM,WAAW,GAAG,IAAI,CAAC,MAAM,GAAG,CAAC,KAAK,CAAC,KAAK,IAAI,CAAC,CAAC,CAAC;QACrD,OAAO,IAAI,aAAK,CAAC;YACb,KAAK;YACL,WAAW;YACX,SAAS,EAAE,WAAW,GAAG,KAAK,CAAC,MAAM,GAAG,CAAC;SAC5C,CAAC,CAAC;IACP,CAAC;IAED,IAAW,kBAAkB;QACzB,MAAM,EAAE,MAAM,EAAE,UAAU,EAAE,GAAG,IAAI,CAAC;QACpC,IAAI,UAAU,IAAI,MAAM,EAAE;YACtB,OAAO,IAAI,CAAC;SACf;QACD,OAAO,IAAI,aAAK,CAAC;YACb,KAAK,EAAE,IAAI,CAAC,IAAI,CAAC,MAAM,CAAC;YACxB,WAAW,EAAE,MAAM;YACnB,SAAS,EAAE,MAAM;SACpB,CAAC,CAAC;IACP,CAAC;IAED,IAAW,cAAc;QACrB,OAAO,IAAI,CAAC,IAAI,CAAC,MAAM,CAAC,IAAI,CAAC,MAAM,CAAC,CAAC;IACzC,CAAC;IAEM,eAAe;QAClB,MAAM,KAAK,GAAG,aAAa,CAAC,IAAI,CAAC,IAAI,CAAC,cAAc,CAAC,CAAC;QACtD,IAAI,CAAC,KAAK,EAAE;YACR,IAAI,IAAI,CAAC,WAAW,EAAE;gBAClB,OAAO,IAAI,CAAC;aACf;YACD,IAAI,CAAC,MAAM,GAAG,IAAI,CAAC,UAAU,CAAC;YAC9B,OAAO,IAAI,CAAC;SACf;QACD,IAAI,CAAC,MAAM,IAAI,CAAC,KAAK,CAAC,KAAK,IAAI,CAAC,CAAC,GAAG,KAAK,CAAC,CAAC,CAAC,CAAC,MAAM,CAAC;QACpD,OAAO,IAAI,CAAC;IAChB,CAAC;IAEM,SAAS,CAAC,MAAc;QAC3B,IAAI,CAAC,MAAM,GAAG,MAAM,CAAC;QACrB,OAAO,IAAI,CAAC;IAChB,CAAC;IAEM,cAAc,CAAC,KAAY;QAC9B,IAAI,CAAC,MAAM,GAAG,KAAK,CAAC,WAAW,CAAC;QAChC,OA
AO,IAAI,CAAC;IAChB,CAAC;IAEM,gBAAgB,CAAC,KAAY;QAChC,IAAI,CAAC,MAAM,GAAG,KAAK,CAAC,SAAS,GAAG,CAAC,CAAC;QAClC,OAAO,IAAI,CAAC;IAChB,CAAC;IAEM,gBAAgB;QACnB,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC,cAAc,CAAC;QAChC,IAAI,CAAC,UAAU,GAAG,IAAI,CAAC,IAAI,CAAC,MAAM,CAAC;QACnC,IAAI,CAAC,MAAM,GAAG,CAAC,CAAC;QAChB,OAAO,IAAI,CAAC;IAChB,CAAC;CACJ;AA5FD,0BA4FC"}
19  doc/test-data/purchase_transaction/node_modules/@fast-csv/parse/build/src/parser/Token.d.ts  generated  vendored  Normal file
@@ -0,0 +1,19 @@
import { ParserOptions } from '../ParserOptions';
export declare type MaybeToken = Token | null;
export interface TokenArgs {
    token: string;
    startCursor: number;
    endCursor: number;
}
export declare class Token {
    static isTokenRowDelimiter(token: Token): boolean;
    static isTokenCarriageReturn(token: Token, parserOptions: ParserOptions): boolean;
    static isTokenComment(token: Token, parserOptions: ParserOptions): boolean;
    static isTokenEscapeCharacter(token: Token, parserOptions: ParserOptions): boolean;
    static isTokenQuote(token: Token, parserOptions: ParserOptions): boolean;
    static isTokenDelimiter(token: Token, parserOptions: ParserOptions): boolean;
    readonly token: string;
    readonly startCursor: number;
    readonly endCursor: number;
    constructor(tokenArgs: TokenArgs);
}
31  doc/test-data/purchase_transaction/node_modules/@fast-csv/parse/build/src/parser/Token.js  generated  vendored  Normal file
@@ -0,0 +1,31 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.Token = void 0;
class Token {
    constructor(tokenArgs) {
        this.token = tokenArgs.token;
        this.startCursor = tokenArgs.startCursor;
        this.endCursor = tokenArgs.endCursor;
    }
    static isTokenRowDelimiter(token) {
        const content = token.token;
        return content === '\r' || content === '\n' || content === '\r\n';
    }
    static isTokenCarriageReturn(token, parserOptions) {
        return token.token === parserOptions.carriageReturn;
    }
    static isTokenComment(token, parserOptions) {
        return parserOptions.supportsComments && !!token && token.token === parserOptions.comment;
    }
    static isTokenEscapeCharacter(token, parserOptions) {
        return token.token === parserOptions.escapeChar;
    }
    static isTokenQuote(token, parserOptions) {
        return token.token === parserOptions.quote;
    }
    static isTokenDelimiter(token, parserOptions) {
        return token.token === parserOptions.delimiter;
    }
}
exports.Token = Token;
//# sourceMappingURL=Token.js.map
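`Token.isTokenRowDelimiter` above recognizes the three newline conventions (CR, LF, CRLF) on the token's raw content. The same predicate on a plain string (a sketch, detached from the Token class):

```javascript
// Recognize the three row-delimiter forms handled by Token.isTokenRowDelimiter.
function isRowDelimiter(content) {
  return content === '\r' || content === '\n' || content === '\r\n';
}

console.log(isRowDelimiter('\r\n')); // true
console.log(isRowDelimiter(','));    // false
```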
1  doc/test-data/purchase_transaction/node_modules/@fast-csv/parse/build/src/parser/Token.js.map  generated  vendored  Normal file
@@ -0,0 +1 @@
{"version":3,"file":"Token.js","sourceRoot":"","sources":["../../../src/parser/Token.ts"],"names":[],"mappings":";;;AAUA,MAAa,KAAK;IAgCd,YAAmB,SAAoB;QACnC,IAAI,CAAC,KAAK,GAAG,SAAS,CAAC,KAAK,CAAC;QAC7B,IAAI,CAAC,WAAW,GAAG,SAAS,CAAC,WAAW,CAAC;QACzC,IAAI,CAAC,SAAS,GAAG,SAAS,CAAC,SAAS,CAAC;IACzC,CAAC;IAnCM,MAAM,CAAC,mBAAmB,CAAC,KAAY;QAC1C,MAAM,OAAO,GAAG,KAAK,CAAC,KAAK,CAAC;QAC5B,OAAO,OAAO,KAAK,IAAI,IAAI,OAAO,KAAK,IAAI,IAAI,OAAO,KAAK,MAAM,CAAC;IACtE,CAAC;IAEM,MAAM,CAAC,qBAAqB,CAAC,KAAY,EAAE,aAA4B;QAC1E,OAAO,KAAK,CAAC,KAAK,KAAK,aAAa,CAAC,cAAc,CAAC;IACxD,CAAC;IAEM,MAAM,CAAC,cAAc,CAAC,KAAY,EAAE,aAA4B;QACnE,OAAO,aAAa,CAAC,gBAAgB,IAAI,CAAC,CAAC,KAAK,IAAI,KAAK,CAAC,KAAK,KAAK,aAAa,CAAC,OAAO,CAAC;IAC9F,CAAC;IAEM,MAAM,CAAC,sBAAsB,CAAC,KAAY,EAAE,aAA4B;QAC3E,OAAO,KAAK,CAAC,KAAK,KAAK,aAAa,CAAC,UAAU,CAAC;IACpD,CAAC;IAEM,MAAM,CAAC,YAAY,CAAC,KAAY,EAAE,aAA4B;QACjE,OAAO,KAAK,CAAC,KAAK,KAAK,aAAa,CAAC,KAAK,CAAC;IAC/C,CAAC;IAEM,MAAM,CAAC,gBAAgB,CAAC,KAAY,EAAE,aAA4B;QACrE,OAAO,KAAK,CAAC,KAAK,KAAK,aAAa,CAAC,SAAS,CAAC;IACnD,CAAC;CAaJ;AArCD,sBAqCC"}
5  doc/test-data/purchase_transaction/node_modules/@fast-csv/parse/build/src/parser/column/ColumnFormatter.d.ts  generated  vendored  Normal file
@@ -0,0 +1,5 @@
import { ParserOptions } from '../../ParserOptions';
export declare class ColumnFormatter {
    readonly format: (col: string) => string;
    constructor(parserOptions: ParserOptions);
}
21  doc/test-data/purchase_transaction/node_modules/@fast-csv/parse/build/src/parser/column/ColumnFormatter.js  generated  vendored  Normal file
@@ -0,0 +1,21 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.ColumnFormatter = void 0;
class ColumnFormatter {
    constructor(parserOptions) {
        if (parserOptions.trim) {
            this.format = (col) => col.trim();
        }
        else if (parserOptions.ltrim) {
            this.format = (col) => col.trimLeft();
        }
        else if (parserOptions.rtrim) {
            this.format = (col) => col.trimRight();
        }
        else {
            this.format = (col) => col;
        }
    }
}
exports.ColumnFormatter = ColumnFormatter;
//# sourceMappingURL=ColumnFormatter.js.map
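`ColumnFormatter` above picks its `format` function once at construction time (trim, ltrim, rtrim, or identity) rather than re-checking options for every cell. A standalone sketch of that dispatch (`makeFormatter` is an illustrative name; the library uses the modern `trimStart`/`trimEnd` aliases of `trimLeft`/`trimRight` nowhere, so their use here is an assumption for clarity):

```javascript
// Choose the per-cell formatter once, as ColumnFormatter's constructor does.
function makeFormatter({ trim = false, ltrim = false, rtrim = false } = {}) {
  if (trim) return (col) => col.trim();
  if (ltrim) return (col) => col.trimStart(); // trimLeft() is the legacy alias
  if (rtrim) return (col) => col.trimEnd();   // trimRight() is the legacy alias
  return (col) => col;
}

console.log(makeFormatter({ trim: true })('  x  '));  // 'x'
console.log(makeFormatter({ rtrim: true })('  x  ')); // '  x'
```

Binding the function once keeps the per-cell hot path free of option checks.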
1  doc/test-data/purchase_transaction/node_modules/@fast-csv/parse/build/src/parser/column/ColumnFormatter.js.map  generated  vendored  Normal file
@@ -0,0 +1 @@
{"version":3,"file":"ColumnFormatter.js","sourceRoot":"","sources":["../../../../src/parser/column/ColumnFormatter.ts"],"names":[],"mappings":";;;AAEA,MAAa,eAAe;IAGxB,YAAmB,aAA4B;QAC3C,IAAI,aAAa,CAAC,IAAI,EAAE;YACpB,IAAI,CAAC,MAAM,GAAG,CAAC,GAAW,EAAU,EAAE,CAAC,GAAG,CAAC,IAAI,EAAE,CAAC;SACrD;aAAM,IAAI,aAAa,CAAC,KAAK,EAAE;YAC5B,IAAI,CAAC,MAAM,GAAG,CAAC,GAAW,EAAU,EAAE,CAAC,GAAG,CAAC,QAAQ,EAAE,CAAC;SACzD;aAAM,IAAI,aAAa,CAAC,KAAK,EAAE;YAC5B,IAAI,CAAC,MAAM,GAAG,CAAC,GAAW,EAAU,EAAE,CAAC,GAAG,CAAC,SAAS,EAAE,CAAC;SAC1D;aAAM;YACH,IAAI,CAAC,MAAM,GAAG,CAAC,GAAW,EAAU,EAAE,CAAC,GAAG,CAAC;SAC9C;IACL,CAAC;CACJ;AAdD,0CAcC"}
11  doc/test-data/purchase_transaction/node_modules/@fast-csv/parse/build/src/parser/column/ColumnParser.d.ts  generated  vendored  Normal file
@@ -0,0 +1,11 @@
import { ParserOptions } from '../../ParserOptions';
import { NonQuotedColumnParser } from './NonQuotedColumnParser';
import { QuotedColumnParser } from './QuotedColumnParser';
import { Scanner } from '../Scanner';
export declare class ColumnParser {
    private readonly parserOptions;
    readonly nonQuotedColumnParser: NonQuotedColumnParser;
    readonly quotedColumnParser: QuotedColumnParser;
    constructor(parserOptions: ParserOptions);
    parse(scanner: Scanner): string | null;
}
23  doc/test-data/purchase_transaction/node_modules/@fast-csv/parse/build/src/parser/column/ColumnParser.js  generated  vendored  Normal file
@@ -0,0 +1,23 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.ColumnParser = void 0;
const NonQuotedColumnParser_1 = require("./NonQuotedColumnParser");
const QuotedColumnParser_1 = require("./QuotedColumnParser");
const Token_1 = require("../Token");
class ColumnParser {
    constructor(parserOptions) {
        this.parserOptions = parserOptions;
        this.quotedColumnParser = new QuotedColumnParser_1.QuotedColumnParser(parserOptions);
        this.nonQuotedColumnParser = new NonQuotedColumnParser_1.NonQuotedColumnParser(parserOptions);
    }
    parse(scanner) {
        const { nextNonSpaceToken } = scanner;
        if (nextNonSpaceToken !== null && Token_1.Token.isTokenQuote(nextNonSpaceToken, this.parserOptions)) {
            scanner.advanceToToken(nextNonSpaceToken);
            return this.quotedColumnParser.parse(scanner);
        }
        return this.nonQuotedColumnParser.parse(scanner);
    }
}
exports.ColumnParser = ColumnParser;
//# sourceMappingURL=ColumnParser.js.map
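`ColumnParser.parse` above dispatches on the first non-space token: a cell that begins with the quote character goes to the quoted parser, everything else to the non-quoted one. The dispatch condition, reduced to plain strings (`isQuotedCell` is an illustrative stand-in, not a library function):

```javascript
// A cell routes to the quoted parser when its first non-space character
// is the configured quote character.
function isQuotedCell(cell, quote = '"') {
  const trimmed = cell.trimStart();
  return trimmed.length > 0 && trimmed[0] === quote;
}

console.log(isQuotedCell('  "a,b"')); // true
console.log(isQuotedCell('ab'));      // false
```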
1  doc/test-data/purchase_transaction/node_modules/@fast-csv/parse/build/src/parser/column/ColumnParser.js.map  generated  vendored  Normal file
@@ -0,0 +1 @@
{"version":3,"file":"ColumnParser.js","sourceRoot":"","sources":["../../../../src/parser/column/ColumnParser.ts"],"names":[],"mappings":";;;AACA,mEAAgE;AAChE,6DAA0D;AAE1D,oCAAiC;AAEjC,MAAa,YAAY;IAOrB,YAAmB,aAA4B;QAC3C,IAAI,CAAC,aAAa,GAAG,aAAa,CAAC;QACnC,IAAI,CAAC,kBAAkB,GAAG,IAAI,uCAAkB,CAAC,aAAa,CAAC,CAAC;QAChE,IAAI,CAAC,qBAAqB,GAAG,IAAI,6CAAqB,CAAC,aAAa,CAAC,CAAC;IAC1E,CAAC;IAEM,KAAK,CAAC,OAAgB;QACzB,MAAM,EAAE,iBAAiB,EAAE,GAAG,OAAO,CAAC;QACtC,IAAI,iBAAiB,KAAK,IAAI,IAAI,aAAK,CAAC,YAAY,CAAC,iBAAiB,EAAE,IAAI,CAAC,aAAa,CAAC,EAAE;YACzF,OAAO,CAAC,cAAc,CAAC,iBAAiB,CAAC,CAAC;YAC1C,OAAO,IAAI,CAAC,kBAAkB,CAAC,KAAK,CAAC,OAAO,CAAC,CAAC;SACjD;QACD,OAAO,IAAI,CAAC,qBAAqB,CAAC,KAAK,CAAC,OAAO,CAAC,CAAC;IACrD,CAAC;CACJ;AArBD,oCAqBC"}
8  doc/test-data/purchase_transaction/node_modules/@fast-csv/parse/build/src/parser/column/NonQuotedColumnParser.d.ts  generated  vendored  Normal file
@@ -0,0 +1,8 @@
import { ParserOptions } from '../../ParserOptions';
import { Scanner } from '../Scanner';
export declare class NonQuotedColumnParser {
    private readonly parserOptions;
    private readonly columnFormatter;
    constructor(parserOptions: ParserOptions);
    parse(scanner: Scanner): string | null;
}
29  doc/test-data/purchase_transaction/node_modules/@fast-csv/parse/build/src/parser/column/NonQuotedColumnParser.js  generated  vendored  Normal file
@@ -0,0 +1,29 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.NonQuotedColumnParser = void 0;
const ColumnFormatter_1 = require("./ColumnFormatter");
const Token_1 = require("../Token");
class NonQuotedColumnParser {
    constructor(parserOptions) {
        this.parserOptions = parserOptions;
        this.columnFormatter = new ColumnFormatter_1.ColumnFormatter(parserOptions);
    }
    parse(scanner) {
        if (!scanner.hasMoreCharacters) {
            return null;
        }
        const { parserOptions } = this;
        const characters = [];
        let nextToken = scanner.nextCharacterToken;
        for (; nextToken; nextToken = scanner.nextCharacterToken) {
            if (Token_1.Token.isTokenDelimiter(nextToken, parserOptions) || Token_1.Token.isTokenRowDelimiter(nextToken)) {
                break;
            }
            characters.push(nextToken.token);
            scanner.advancePastToken(nextToken);
        }
        return this.columnFormatter.format(characters.join(''));
    }
}
exports.NonQuotedColumnParser = NonQuotedColumnParser;
//# sourceMappingURL=NonQuotedColumnParser.js.map
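The non-quoted parser above collects characters one token at a time until it hits the cell delimiter or a row delimiter. The same scan, reduced to plain string indexing (`readNonQuotedColumn` is an illustrative stand-in; the real parser also applies the column formatter and handles CRLF via tokens):

```javascript
// Collect a non-quoted cell: stop at the delimiter or at a row delimiter,
// mirroring the loop in NonQuotedColumnParser.parse.
function readNonQuotedColumn(line, start, delimiter = ',') {
  const chars = [];
  for (let i = start; i < line.length; i += 1) {
    const ch = line[i];
    if (ch === delimiter || ch === '\n' || ch === '\r') break;
    chars.push(ch);
  }
  return chars.join('');
}

console.log(readNonQuotedColumn('abc,def\n', 0)); // 'abc'
console.log(readNonQuotedColumn('abc,def\n', 4)); // 'def'
```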
1  doc/test-data/purchase_transaction/node_modules/@fast-csv/parse/build/src/parser/column/NonQuotedColumnParser.js.map  generated  vendored  Normal file
@@ -0,0 +1 @@
{"version":3,"file":"NonQuotedColumnParser.js","sourceRoot":"","sources":["../../../../src/parser/column/NonQuotedColumnParser.ts"],"names":[],"mappings":";;;AACA,uDAAoD;AAEpD,oCAAiC;AAEjC,MAAa,qBAAqB;IAK9B,YAAmB,aAA4B;QAC3C,IAAI,CAAC,aAAa,GAAG,aAAa,CAAC;QACnC,IAAI,CAAC,eAAe,GAAG,IAAI,iCAAe,CAAC,aAAa,CAAC,CAAC;IAC9D,CAAC;IAEM,KAAK,CAAC,OAAgB;QACzB,IAAI,CAAC,OAAO,CAAC,iBAAiB,EAAE;YAC5B,OAAO,IAAI,CAAC;SACf;QACD,MAAM,EAAE,aAAa,EAAE,GAAG,IAAI,CAAC;QAC/B,MAAM,UAAU,GAAG,EAAE,CAAC;QACtB,IAAI,SAAS,GAAG,OAAO,CAAC,kBAAkB,CAAC;QAC3C,OAAO,SAAS,EAAE,SAAS,GAAG,OAAO,CAAC,kBAAkB,EAAE;YACtD,IAAI,aAAK,CAAC,gBAAgB,CAAC,SAAS,EAAE,aAAa,CAAC,IAAI,aAAK,CAAC,mBAAmB,CAAC,SAAS,CAAC,EAAE;gBAC1F,MAAM;aACT;YACD,UAAU,CAAC,IAAI,CAAC,SAAS,CAAC,KAAK,CAAC,CAAC;YACjC,OAAO,CAAC,gBAAgB,CAAC,SAAS,CAAC,CAAC;SACvC;QACD,OAAO,IAAI,CAAC,eAAe,CAAC,MAAM,CAAC,UAAU,CAAC,IAAI,CAAC,EAAE,CAAC,CAAC,CAAC;IAC5D,CAAC;CACJ;AA1BD,sDA0BC"}
10
doc/test-data/purchase_transaction/node_modules/@fast-csv/parse/build/src/parser/column/QuotedColumnParser.d.ts
generated
vendored
Normal file
@@ -0,0 +1,10 @@
import { ParserOptions } from '../../ParserOptions';
import { Scanner } from '../Scanner';
export declare class QuotedColumnParser {
    private readonly parserOptions;
    private readonly columnFormatter;
    constructor(parserOptions: ParserOptions);
    parse(scanner: Scanner): string | null;
    private gatherDataBetweenQuotes;
    private checkForMalformedColumn;
}
97
doc/test-data/purchase_transaction/node_modules/@fast-csv/parse/build/src/parser/column/QuotedColumnParser.js
generated
vendored
Normal file
@@ -0,0 +1,97 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.QuotedColumnParser = void 0;
const ColumnFormatter_1 = require("./ColumnFormatter");
const Token_1 = require("../Token");
class QuotedColumnParser {
    constructor(parserOptions) {
        this.parserOptions = parserOptions;
        this.columnFormatter = new ColumnFormatter_1.ColumnFormatter(parserOptions);
    }
    parse(scanner) {
        if (!scanner.hasMoreCharacters) {
            return null;
        }
        const originalCursor = scanner.cursor;
        const { foundClosingQuote, col } = this.gatherDataBetweenQuotes(scanner);
        if (!foundClosingQuote) {
            // reset the cursor to the original
            scanner.advanceTo(originalCursor);
            // if we didnt find a closing quote but we potentially have more data then skip the parsing
            // and return the original scanner.
            if (!scanner.hasMoreData) {
                throw new Error(`Parse Error: missing closing: '${this.parserOptions.quote || ''}' in line: at '${scanner.lineFromCursor.replace(/[\r\n]/g, "\\n'")}'`);
            }
            return null;
        }
        this.checkForMalformedColumn(scanner);
        return col;
    }
    gatherDataBetweenQuotes(scanner) {
        const { parserOptions } = this;
        let foundStartingQuote = false;
        let foundClosingQuote = false;
        const characters = [];
        let nextToken = scanner.nextCharacterToken;
        for (; !foundClosingQuote && nextToken !== null; nextToken = scanner.nextCharacterToken) {
            const isQuote = Token_1.Token.isTokenQuote(nextToken, parserOptions);
            // ignore first quote
            if (!foundStartingQuote && isQuote) {
                foundStartingQuote = true;
            }
            else if (foundStartingQuote) {
                if (Token_1.Token.isTokenEscapeCharacter(nextToken, parserOptions)) {
                    // advance past the escape character so we can get the next one in line
                    scanner.advancePastToken(nextToken);
                    const tokenFollowingEscape = scanner.nextCharacterToken;
                    // if the character following the escape is a quote character then just add
                    // the quote and advance to that character
                    if (tokenFollowingEscape !== null &&
                        (Token_1.Token.isTokenQuote(tokenFollowingEscape, parserOptions) ||
                            Token_1.Token.isTokenEscapeCharacter(tokenFollowingEscape, parserOptions))) {
                        characters.push(tokenFollowingEscape.token);
                        nextToken = tokenFollowingEscape;
                    }
                    else if (isQuote) {
                        // if the escape is also a quote then we found our closing quote and finish early
                        foundClosingQuote = true;
                    }
                    else {
                        // other wise add the escape token to the characters since it wast escaping anything
                        characters.push(nextToken.token);
                    }
                }
                else if (isQuote) {
                    // we found our closing quote!
                    foundClosingQuote = true;
                }
                else {
                    // add the token to the characters
                    characters.push(nextToken.token);
                }
            }
            scanner.advancePastToken(nextToken);
        }
        return { col: this.columnFormatter.format(characters.join('')), foundClosingQuote };
    }
    checkForMalformedColumn(scanner) {
        const { parserOptions } = this;
        const { nextNonSpaceToken } = scanner;
        if (nextNonSpaceToken) {
            const isNextTokenADelimiter = Token_1.Token.isTokenDelimiter(nextNonSpaceToken, parserOptions);
            const isNextTokenARowDelimiter = Token_1.Token.isTokenRowDelimiter(nextNonSpaceToken);
            if (!(isNextTokenADelimiter || isNextTokenARowDelimiter)) {
                // if the final quote was NOT followed by a column (,) or row(\n) delimiter then its a bad column
                // tldr: only part of the column was quoted
                const linePreview = scanner.lineFromCursor.substr(0, 10).replace(/[\r\n]/g, "\\n'");
                throw new Error(`Parse Error: expected: '${parserOptions.escapedDelimiter}' OR new line got: '${nextNonSpaceToken.token}'. at '${linePreview}`);
            }
            scanner.advanceToToken(nextNonSpaceToken);
        }
        else if (!scanner.hasMoreData) {
            scanner.advancePastLine();
        }
    }
}
exports.QuotedColumnParser = QuotedColumnParser;
//# sourceMappingURL=QuotedColumnParser.js.map
1
doc/test-data/purchase_transaction/node_modules/@fast-csv/parse/build/src/parser/column/QuotedColumnParser.js.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"file":"QuotedColumnParser.js","sourceRoot":"","sources":["../../../../src/parser/column/QuotedColumnParser.ts"],"names":[],"mappings":";;;AAAA,uDAAoD;AAGpD,oCAAiC;AAOjC,MAAa,kBAAkB;IAK3B,YAAmB,aAA4B;QAC3C,IAAI,CAAC,aAAa,GAAG,aAAa,CAAC;QACnC,IAAI,CAAC,eAAe,GAAG,IAAI,iCAAe,CAAC,aAAa,CAAC,CAAC;IAC9D,CAAC;IAEM,KAAK,CAAC,OAAgB;QACzB,IAAI,CAAC,OAAO,CAAC,iBAAiB,EAAE;YAC5B,OAAO,IAAI,CAAC;SACf;QACD,MAAM,cAAc,GAAG,OAAO,CAAC,MAAM,CAAC;QACtC,MAAM,EAAE,iBAAiB,EAAE,GAAG,EAAE,GAAG,IAAI,CAAC,uBAAuB,CAAC,OAAO,CAAC,CAAC;QACzE,IAAI,CAAC,iBAAiB,EAAE;YACpB,mCAAmC;YACnC,OAAO,CAAC,SAAS,CAAC,cAAc,CAAC,CAAC;YAClC,2FAA2F;YAC3F,mCAAmC;YACnC,IAAI,CAAC,OAAO,CAAC,WAAW,EAAE;gBACtB,MAAM,IAAI,KAAK,CACX,kCACI,IAAI,CAAC,aAAa,CAAC,KAAK,IAAI,EAChC,kBAAkB,OAAO,CAAC,cAAc,CAAC,OAAO,CAAC,SAAS,EAAE,MAAM,CAAC,GAAG,CACzE,CAAC;aACL;YACD,OAAO,IAAI,CAAC;SACf;QACD,IAAI,CAAC,uBAAuB,CAAC,OAAO,CAAC,CAAC;QACtC,OAAO,GAAG,CAAC;IACf,CAAC;IAEO,uBAAuB,CAAC,OAAgB;QAC5C,MAAM,EAAE,aAAa,EAAE,GAAG,IAAI,CAAC;QAC/B,IAAI,kBAAkB,GAAG,KAAK,CAAC;QAC/B,IAAI,iBAAiB,GAAG,KAAK,CAAC;QAC9B,MAAM,UAAU,GAAG,EAAE,CAAC;QACtB,IAAI,SAAS,GAAiB,OAAO,CAAC,kBAAkB,CAAC;QACzD,OAAO,CAAC,iBAAiB,IAAI,SAAS,KAAK,IAAI,EAAE,SAAS,GAAG,OAAO,CAAC,kBAAkB,EAAE;YACrF,MAAM,OAAO,GAAG,aAAK,CAAC,YAAY,CAAC,SAAS,EAAE,aAAa,CAAC,CAAC;YAC7D,qBAAqB;YACrB,IAAI,CAAC,kBAAkB,IAAI,OAAO,EAAE;gBAChC,kBAAkB,GAAG,IAAI,CAAC;aAC7B;iBAAM,IAAI,kBAAkB,EAAE;gBAC3B,IAAI,aAAK,CAAC,sBAAsB,CAAC,SAAS,EAAE,aAAa,CAAC,EAAE;oBACxD,uEAAuE;oBACvE,OAAO,CAAC,gBAAgB,CAAC,SAAS,CAAC,CAAC;oBACpC,MAAM,oBAAoB,GAAG,OAAO,CAAC,kBAAkB,CAAC;oBACxD,2EAA2E;oBAC3E,0CAA0C;oBAC1C,IACI,oBAAoB,KAAK,IAAI;wBAC7B,CAAC,aAAK,CAAC,YAAY,CAAC,oBAAoB,EAAE,aAAa,CAAC;4BACpD,aAAK,CAAC,sBAAsB,CAAC,oBAAoB,EAAE,aAAa,CAAC,CAAC,EACxE;wBACE,UAAU,CAAC,IAAI,CAAC,oBAAoB,CAAC,KAAK,CAAC,CAAC;wBAC5C,SAAS,GAAG,oBAAoB,CAAC;qBACpC;yBAAM,IAAI,OAAO,EAAE;wBAChB,iFAAiF;wBACjF,iBAAiB,GAAG,IAAI,CAAC;qBAC5B;yBAAM;wBACH,oFAAoF;wBACpF,UAAU,CAAC,IAAI,CAAC,SAAS,CAAC,KAAK,CAAC,CAAC;qBACpC;iBACJ;qBAAM,IAAI,OAAO,EAAE;oBAChB,8BAA8B;oBAC9B,iBAAiB,
GAAG,IAAI,CAAC;iBAC5B;qBAAM;oBACH,kCAAkC;oBAClC,UAAU,CAAC,IAAI,CAAC,SAAS,CAAC,KAAK,CAAC,CAAC;iBACpC;aACJ;YACD,OAAO,CAAC,gBAAgB,CAAC,SAAS,CAAC,CAAC;SACvC;QACD,OAAO,EAAE,GAAG,EAAE,IAAI,CAAC,eAAe,CAAC,MAAM,CAAC,UAAU,CAAC,IAAI,CAAC,EAAE,CAAC,CAAC,EAAE,iBAAiB,EAAE,CAAC;IACxF,CAAC;IAEO,uBAAuB,CAAC,OAAgB;QAC5C,MAAM,EAAE,aAAa,EAAE,GAAG,IAAI,CAAC;QAC/B,MAAM,EAAE,iBAAiB,EAAE,GAAG,OAAO,CAAC;QACtC,IAAI,iBAAiB,EAAE;YACnB,MAAM,qBAAqB,GAAG,aAAK,CAAC,gBAAgB,CAAC,iBAAiB,EAAE,aAAa,CAAC,CAAC;YACvF,MAAM,wBAAwB,GAAG,aAAK,CAAC,mBAAmB,CAAC,iBAAiB,CAAC,CAAC;YAC9E,IAAI,CAAC,CAAC,qBAAqB,IAAI,wBAAwB,CAAC,EAAE;gBACtD,iGAAiG;gBACjG,2CAA2C;gBAC3C,MAAM,WAAW,GAAG,OAAO,CAAC,cAAc,CAAC,MAAM,CAAC,CAAC,EAAE,EAAE,CAAC,CAAC,OAAO,CAAC,SAAS,EAAE,MAAM,CAAC,CAAC;gBACpF,MAAM,IAAI,KAAK,CACX,2BAA2B,aAAa,CAAC,gBAAgB,uBAAuB,iBAAiB,CAAC,KAAK,UAAU,WAAW,EAAE,CACjI,CAAC;aACL;YACD,OAAO,CAAC,cAAc,CAAC,iBAAiB,CAAC,CAAC;SAC7C;aAAM,IAAI,CAAC,OAAO,CAAC,WAAW,EAAE;YAC7B,OAAO,CAAC,eAAe,EAAE,CAAC;SAC7B;IACL,CAAC;CACJ;AAlGD,gDAkGC"}
4
doc/test-data/purchase_transaction/node_modules/@fast-csv/parse/build/src/parser/column/index.d.ts
generated
vendored
Normal file
@@ -0,0 +1,4 @@
export { ColumnParser } from './ColumnParser';
export { NonQuotedColumnParser } from './NonQuotedColumnParser';
export { QuotedColumnParser } from './QuotedColumnParser';
export { ColumnFormatter } from './ColumnFormatter';
12
doc/test-data/purchase_transaction/node_modules/@fast-csv/parse/build/src/parser/column/index.js
generated
vendored
Normal file
@@ -0,0 +1,12 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.ColumnFormatter = exports.QuotedColumnParser = exports.NonQuotedColumnParser = exports.ColumnParser = void 0;
var ColumnParser_1 = require("./ColumnParser");
Object.defineProperty(exports, "ColumnParser", { enumerable: true, get: function () { return ColumnParser_1.ColumnParser; } });
var NonQuotedColumnParser_1 = require("./NonQuotedColumnParser");
Object.defineProperty(exports, "NonQuotedColumnParser", { enumerable: true, get: function () { return NonQuotedColumnParser_1.NonQuotedColumnParser; } });
var QuotedColumnParser_1 = require("./QuotedColumnParser");
Object.defineProperty(exports, "QuotedColumnParser", { enumerable: true, get: function () { return QuotedColumnParser_1.QuotedColumnParser; } });
var ColumnFormatter_1 = require("./ColumnFormatter");
Object.defineProperty(exports, "ColumnFormatter", { enumerable: true, get: function () { return ColumnFormatter_1.ColumnFormatter; } });
//# sourceMappingURL=index.js.map
1
doc/test-data/purchase_transaction/node_modules/@fast-csv/parse/build/src/parser/column/index.js.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"file":"index.js","sourceRoot":"","sources":["../../../../src/parser/column/index.ts"],"names":[],"mappings":";;;AAAA,+CAA8C;AAArC,4GAAA,YAAY,OAAA;AACrB,iEAAgE;AAAvD,8HAAA,qBAAqB,OAAA;AAC9B,2DAA0D;AAAjD,wHAAA,kBAAkB,OAAA;AAC3B,qDAAoD;AAA3C,kHAAA,eAAe,OAAA"}
5
doc/test-data/purchase_transaction/node_modules/@fast-csv/parse/build/src/parser/index.d.ts
generated
vendored
Normal file
@@ -0,0 +1,5 @@
export { Parser } from './Parser';
export { RowParser } from './RowParser';
export { Scanner } from './Scanner';
export { Token, MaybeToken } from './Token';
export { ColumnParser, NonQuotedColumnParser, QuotedColumnParser } from './column';
16
doc/test-data/purchase_transaction/node_modules/@fast-csv/parse/build/src/parser/index.js
generated
vendored
Normal file
@@ -0,0 +1,16 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.QuotedColumnParser = exports.NonQuotedColumnParser = exports.ColumnParser = exports.Token = exports.Scanner = exports.RowParser = exports.Parser = void 0;
var Parser_1 = require("./Parser");
Object.defineProperty(exports, "Parser", { enumerable: true, get: function () { return Parser_1.Parser; } });
var RowParser_1 = require("./RowParser");
Object.defineProperty(exports, "RowParser", { enumerable: true, get: function () { return RowParser_1.RowParser; } });
var Scanner_1 = require("./Scanner");
Object.defineProperty(exports, "Scanner", { enumerable: true, get: function () { return Scanner_1.Scanner; } });
var Token_1 = require("./Token");
Object.defineProperty(exports, "Token", { enumerable: true, get: function () { return Token_1.Token; } });
var column_1 = require("./column");
Object.defineProperty(exports, "ColumnParser", { enumerable: true, get: function () { return column_1.ColumnParser; } });
Object.defineProperty(exports, "NonQuotedColumnParser", { enumerable: true, get: function () { return column_1.NonQuotedColumnParser; } });
Object.defineProperty(exports, "QuotedColumnParser", { enumerable: true, get: function () { return column_1.QuotedColumnParser; } });
//# sourceMappingURL=index.js.map
1
doc/test-data/purchase_transaction/node_modules/@fast-csv/parse/build/src/parser/index.js.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"file":"index.js","sourceRoot":"","sources":["../../../src/parser/index.ts"],"names":[],"mappings":";;;AAAA,mCAAkC;AAAzB,gGAAA,MAAM,OAAA;AACf,yCAAwC;AAA/B,sGAAA,SAAS,OAAA;AAClB,qCAAoC;AAA3B,kGAAA,OAAO,OAAA;AAChB,iCAA4C;AAAnC,8FAAA,KAAK,OAAA;AACd,mCAAmF;AAA1E,sGAAA,YAAY,OAAA;AAAE,+GAAA,qBAAqB,OAAA;AAAE,4GAAA,kBAAkB,OAAA"}
17
doc/test-data/purchase_transaction/node_modules/@fast-csv/parse/build/src/transforms/HeaderTransformer.d.ts
generated
vendored
Normal file
@@ -0,0 +1,17 @@
import { ParserOptions } from '../ParserOptions';
import { HeaderArray, Row, RowArray, RowValidatorCallback } from '../types';
export declare class HeaderTransformer<O extends Row> {
    private readonly parserOptions;
    headers: HeaderArray | null;
    private receivedHeaders;
    private readonly shouldUseFirstRow;
    private processedFirstRow;
    private headersLength;
    private readonly headersTransform?;
    constructor(parserOptions: ParserOptions);
    transform(row: RowArray, cb: RowValidatorCallback<O>): void;
    private shouldMapRow;
    private processRow;
    private mapHeaders;
    private setHeaders;
}
115
doc/test-data/purchase_transaction/node_modules/@fast-csv/parse/build/src/transforms/HeaderTransformer.js
generated
vendored
Normal file
@@ -0,0 +1,115 @@
"use strict";
var __importDefault = (this && this.__importDefault) || function (mod) {
    return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.HeaderTransformer = void 0;
const lodash_isundefined_1 = __importDefault(require("lodash.isundefined"));
const lodash_isfunction_1 = __importDefault(require("lodash.isfunction"));
const lodash_uniq_1 = __importDefault(require("lodash.uniq"));
const lodash_groupby_1 = __importDefault(require("lodash.groupby"));
class HeaderTransformer {
    constructor(parserOptions) {
        this.headers = null;
        this.receivedHeaders = false;
        this.shouldUseFirstRow = false;
        this.processedFirstRow = false;
        this.headersLength = 0;
        this.parserOptions = parserOptions;
        if (parserOptions.headers === true) {
            this.shouldUseFirstRow = true;
        }
        else if (Array.isArray(parserOptions.headers)) {
            this.setHeaders(parserOptions.headers);
        }
        else if (lodash_isfunction_1.default(parserOptions.headers)) {
            this.headersTransform = parserOptions.headers;
        }
    }
    transform(row, cb) {
        if (!this.shouldMapRow(row)) {
            return cb(null, { row: null, isValid: true });
        }
        return cb(null, this.processRow(row));
    }
    shouldMapRow(row) {
        const { parserOptions } = this;
        if (!this.headersTransform && parserOptions.renameHeaders && !this.processedFirstRow) {
            if (!this.receivedHeaders) {
                throw new Error('Error renaming headers: new headers must be provided in an array');
            }
            this.processedFirstRow = true;
            return false;
        }
        if (!this.receivedHeaders && Array.isArray(row)) {
            if (this.headersTransform) {
                this.setHeaders(this.headersTransform(row));
            }
            else if (this.shouldUseFirstRow) {
                this.setHeaders(row);
            }
            else {
                // dont do anything with the headers if we didnt receive a transform or shouldnt use the first row.
                return true;
            }
            return false;
        }
        return true;
    }
    processRow(row) {
        if (!this.headers) {
            return { row: row, isValid: true };
        }
        const { parserOptions } = this;
        if (!parserOptions.discardUnmappedColumns && row.length > this.headersLength) {
            if (!parserOptions.strictColumnHandling) {
                throw new Error(`Unexpected Error: column header mismatch expected: ${this.headersLength} columns got: ${row.length}`);
            }
            return {
                row: row,
                isValid: false,
                reason: `Column header mismatch expected: ${this.headersLength} columns got: ${row.length}`,
            };
        }
        if (parserOptions.strictColumnHandling && row.length < this.headersLength) {
            return {
                row: row,
                isValid: false,
                reason: `Column header mismatch expected: ${this.headersLength} columns got: ${row.length}`,
            };
        }
        return { row: this.mapHeaders(row), isValid: true };
    }
    mapHeaders(row) {
        const rowMap = {};
        const { headers, headersLength } = this;
        for (let i = 0; i < headersLength; i += 1) {
            const header = headers[i];
            if (!lodash_isundefined_1.default(header)) {
                const val = row[i];
                // eslint-disable-next-line no-param-reassign
                if (lodash_isundefined_1.default(val)) {
                    rowMap[header] = '';
                }
                else {
                    rowMap[header] = val;
                }
            }
        }
        return rowMap;
    }
    setHeaders(headers) {
        var _a;
        const filteredHeaders = headers.filter((h) => !!h);
        if (lodash_uniq_1.default(filteredHeaders).length !== filteredHeaders.length) {
            const grouped = lodash_groupby_1.default(filteredHeaders);
            const duplicates = Object.keys(grouped).filter((dup) => grouped[dup].length > 1);
            throw new Error(`Duplicate headers found ${JSON.stringify(duplicates)}`);
        }
        this.headers = headers;
        this.receivedHeaders = true;
        this.headersLength = ((_a = this.headers) === null || _a === void 0 ? void 0 : _a.length) || 0;
    }
}
exports.HeaderTransformer = HeaderTransformer;
//# sourceMappingURL=HeaderTransformer.js.map
1
doc/test-data/purchase_transaction/node_modules/@fast-csv/parse/build/src/transforms/HeaderTransformer.js.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"file":"HeaderTransformer.js","sourceRoot":"","sources":["../../../src/transforms/HeaderTransformer.ts"],"names":[],"mappings":";;;;;;AAAA,4EAA6C;AAC7C,0EAA2C;AAC3C,8DAA+B;AAC/B,oEAAqC;AAYrC,MAAa,iBAAiB;IAe1B,YAAmB,aAA4B;QAZ/C,YAAO,GAAuB,IAAI,CAAC;QAE3B,oBAAe,GAAG,KAAK,CAAC;QAEf,sBAAiB,GAAY,KAAK,CAAC;QAE5C,sBAAiB,GAAG,KAAK,CAAC;QAE1B,kBAAa,GAAG,CAAC,CAAC;QAKtB,IAAI,CAAC,aAAa,GAAG,aAAa,CAAC;QACnC,IAAI,aAAa,CAAC,OAAO,KAAK,IAAI,EAAE;YAChC,IAAI,CAAC,iBAAiB,GAAG,IAAI,CAAC;SACjC;aAAM,IAAI,KAAK,CAAC,OAAO,CAAC,aAAa,CAAC,OAAO,CAAC,EAAE;YAC7C,IAAI,CAAC,UAAU,CAAC,aAAa,CAAC,OAAO,CAAC,CAAC;SAC1C;aAAM,IAAI,2BAAU,CAAC,aAAa,CAAC,OAAO,CAAC,EAAE;YAC1C,IAAI,CAAC,gBAAgB,GAAG,aAAa,CAAC,OAAO,CAAC;SACjD;IACL,CAAC;IAEM,SAAS,CAAC,GAAa,EAAE,EAA2B;QACvD,IAAI,CAAC,IAAI,CAAC,YAAY,CAAC,GAAG,CAAC,EAAE;YACzB,OAAO,EAAE,CAAC,IAAI,EAAE,EAAE,GAAG,EAAE,IAAI,EAAE,OAAO,EAAE,IAAI,EAAE,CAAC,CAAC;SACjD;QACD,OAAO,EAAE,CAAC,IAAI,EAAE,IAAI,CAAC,UAAU,CAAC,GAAG,CAAC,CAAC,CAAC;IAC1C,CAAC;IAEO,YAAY,CAAC,GAAQ;QACzB,MAAM,EAAE,aAAa,EAAE,GAAG,IAAI,CAAC;QAC/B,IAAI,CAAC,IAAI,CAAC,gBAAgB,IAAI,aAAa,CAAC,aAAa,IAAI,CAAC,IAAI,CAAC,iBAAiB,EAAE;YAClF,IAAI,CAAC,IAAI,CAAC,eAAe,EAAE;gBACvB,MAAM,IAAI,KAAK,CAAC,kEAAkE,CAAC,CAAC;aACvF;YACD,IAAI,CAAC,iBAAiB,GAAG,IAAI,CAAC;YAC9B,OAAO,KAAK,CAAC;SAChB;QACD,IAAI,CAAC,IAAI,CAAC,eAAe,IAAI,KAAK,CAAC,OAAO,CAAC,GAAG,CAAC,EAAE;YAC7C,IAAI,IAAI,CAAC,gBAAgB,EAAE;gBACvB,IAAI,CAAC,UAAU,CAAC,IAAI,CAAC,gBAAgB,CAAC,GAAG,CAAC,CAAC,CAAC;aAC/C;iBAAM,IAAI,IAAI,CAAC,iBAAiB,EAAE;gBAC/B,IAAI,CAAC,UAAU,CAAC,GAAG,CAAC,CAAC;aACxB;iBAAM;gBACH,mGAAmG;gBACnG,OAAO,IAAI,CAAC;aACf;YACD,OAAO,KAAK,CAAC;SAChB;QACD,OAAO,IAAI,CAAC;IAChB,CAAC;IAEO,UAAU,CAAC,GAAqB;QACpC,IAAI,CAAC,IAAI,CAAC,OAAO,EAAE;YACf,OAAO,EAAE,GAAG,EAAG,GAAkB,EAAE,OAAO,EAAE,IAAI,EAAE,CAAC;SACtD;QACD,MAAM,EAAE,aAAa,EAAE,GAAG,IAAI,CAAC;QAC/B,IAAI,CAAC,aAAa,CAAC,sBAAsB,IAAI,GAAG,CAAC,MAAM,GAAG,IAAI,CAAC,aAAa,EAAE;YAC1E,IAAI,CAAC,aAAa,CAAC,oBAAoB,EAAE;gBACrC,MAAM,IAAI,KAAK,CACX,sDAAsD,IAAI,CAAC,aAAa,iBAAiB,GAAG,CAAC,MAAM,EAAE,CACxG,CAAC;aACL;YA
CD,OAAO;gBACH,GAAG,EAAG,GAAkB;gBACxB,OAAO,EAAE,KAAK;gBACd,MAAM,EAAE,oCAAoC,IAAI,CAAC,aAAa,iBAAiB,GAAG,CAAC,MAAM,EAAE;aAC9F,CAAC;SACL;QACD,IAAI,aAAa,CAAC,oBAAoB,IAAI,GAAG,CAAC,MAAM,GAAG,IAAI,CAAC,aAAa,EAAE;YACvE,OAAO;gBACH,GAAG,EAAG,GAAkB;gBACxB,OAAO,EAAE,KAAK;gBACd,MAAM,EAAE,oCAAoC,IAAI,CAAC,aAAa,iBAAiB,GAAG,CAAC,MAAM,EAAE;aAC9F,CAAC;SACL;QACD,OAAO,EAAE,GAAG,EAAE,IAAI,CAAC,UAAU,CAAC,GAAG,CAAC,EAAE,OAAO,EAAE,IAAI,EAAE,CAAC;IACxD,CAAC;IAEO,UAAU,CAAC,GAAqB;QACpC,MAAM,MAAM,GAAW,EAAE,CAAC;QAC1B,MAAM,EAAE,OAAO,EAAE,aAAa,EAAE,GAAG,IAAI,CAAC;QACxC,KAAK,IAAI,CAAC,GAAG,CAAC,EAAE,CAAC,GAAG,aAAa,EAAE,CAAC,IAAI,CAAC,EAAE;YACvC,MAAM,MAAM,GAAI,OAAoB,CAAC,CAAC,CAAC,CAAC;YACxC,IAAI,CAAC,4BAAW,CAAC,MAAM,CAAC,EAAE;gBACtB,MAAM,GAAG,GAAG,GAAG,CAAC,CAAC,CAAC,CAAC;gBACnB,6CAA6C;gBAC7C,IAAI,4BAAW,CAAC,GAAG,CAAC,EAAE;oBAClB,MAAM,CAAC,MAAM,CAAC,GAAG,EAAE,CAAC;iBACvB;qBAAM;oBACH,MAAM,CAAC,MAAM,CAAC,GAAG,GAAG,CAAC;iBACxB;aACJ;SACJ;QACD,OAAO,MAAW,CAAC;IACvB,CAAC;IAEO,UAAU,CAAC,OAAoB;;QACnC,MAAM,eAAe,GAAG,OAAO,CAAC,MAAM,CAAC,CAAC,CAAC,EAAE,EAAE,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC;QACnD,IAAI,qBAAI,CAAC,eAAe,CAAC,CAAC,MAAM,KAAK,eAAe,CAAC,MAAM,EAAE;YACzD,MAAM,OAAO,GAAG,wBAAO,CAAC,eAAe,CAAC,CAAC;YACzC,MAAM,UAAU,GAAG,MAAM,CAAC,IAAI,CAAC,OAAO,CAAC,CAAC,MAAM,CAAC,CAAC,GAAG,EAAE,EAAE,CAAC,OAAO,CAAC,GAAG,CAAC,CAAC,MAAM,GAAG,CAAC,CAAC,CAAC;YACjF,MAAM,IAAI,KAAK,CAAC,2BAA2B,IAAI,CAAC,SAAS,CAAC,UAAU,CAAC,EAAE,CAAC,CAAC;SAC5E;QACD,IAAI,CAAC,OAAO,GAAG,OAAO,CAAC;QACvB,IAAI,CAAC,eAAe,GAAG,IAAI,CAAC;QAC5B,IAAI,CAAC,aAAa,GAAG,OAAA,IAAI,CAAC,OAAO,0CAAE,MAAM,KAAI,CAAC,CAAC;IACnD,CAAC;CACJ;AAhHD,8CAgHC"}
12
doc/test-data/purchase_transaction/node_modules/@fast-csv/parse/build/src/transforms/RowTransformerValidator.d.ts
generated
vendored
Normal file
@@ -0,0 +1,12 @@
import { Row, RowTransformFunction, RowValidatorCallback, RowValidate } from '../types';
export declare class RowTransformerValidator<I extends Row, O extends Row> {
    private static createTransform;
    private static createValidator;
    private _rowTransform;
    private _rowValidator;
    set rowTransform(transformFunction: RowTransformFunction<I, O>);
    set rowValidator(validateFunction: RowValidate<O>);
    transformAndValidate(row: I, cb: RowValidatorCallback<O>): void;
    private callTransformer;
    private callValidator;
}
93
doc/test-data/purchase_transaction/node_modules/@fast-csv/parse/build/src/transforms/RowTransformerValidator.js
generated
vendored
Normal file
@@ -0,0 +1,93 @@
"use strict";
var __importDefault = (this && this.__importDefault) || function (mod) {
    return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.RowTransformerValidator = void 0;
const lodash_isfunction_1 = __importDefault(require("lodash.isfunction"));
const types_1 = require("../types");
class RowTransformerValidator {
    constructor() {
        this._rowTransform = null;
        this._rowValidator = null;
    }
    // eslint-disable-next-line @typescript-eslint/no-shadow
    static createTransform(transformFunction) {
        if (types_1.isSyncTransform(transformFunction)) {
            return (row, cb) => {
                let transformed = null;
                try {
                    transformed = transformFunction(row);
                }
                catch (e) {
                    return cb(e);
                }
                return cb(null, transformed);
            };
        }
        return transformFunction;
    }
    static createValidator(validateFunction) {
        if (types_1.isSyncValidate(validateFunction)) {
            return (row, cb) => {
                cb(null, { row, isValid: validateFunction(row) });
            };
        }
        return (row, cb) => {
            validateFunction(row, (err, isValid, reason) => {
                if (err) {
                    return cb(err);
                }
                if (isValid) {
                    return cb(null, { row, isValid, reason });
                }
                return cb(null, { row, isValid: false, reason });
            });
        };
    }
    set rowTransform(transformFunction) {
        if (!lodash_isfunction_1.default(transformFunction)) {
            throw new TypeError('The transform should be a function');
        }
        this._rowTransform = RowTransformerValidator.createTransform(transformFunction);
    }
    set rowValidator(validateFunction) {
        if (!lodash_isfunction_1.default(validateFunction)) {
            throw new TypeError('The validate should be a function');
        }
        this._rowValidator = RowTransformerValidator.createValidator(validateFunction);
    }
    transformAndValidate(row, cb) {
        return this.callTransformer(row, (transformErr, transformedRow) => {
            if (transformErr) {
                return cb(transformErr);
            }
            if (!transformedRow) {
                return cb(null, { row: null, isValid: true });
            }
            return this.callValidator(transformedRow, (validateErr, validationResult) => {
                if (validateErr) {
                    return cb(validateErr);
                }
                if (validationResult && !validationResult.isValid) {
                    return cb(null, { row: transformedRow, isValid: false, reason: validationResult.reason });
                }
                return cb(null, { row: transformedRow, isValid: true });
            });
        });
    }
    callTransformer(row, cb) {
        if (!this._rowTransform) {
            return cb(null, row);
        }
        return this._rowTransform(row, cb);
    }
    callValidator(row, cb) {
        if (!this._rowValidator) {
            return cb(null, { row, isValid: true });
        }
        return this._rowValidator(row, cb);
    }
}
exports.RowTransformerValidator = RowTransformerValidator;
//# sourceMappingURL=RowTransformerValidator.js.map
1
doc/test-data/purchase_transaction/node_modules/@fast-csv/parse/build/src/transforms/RowTransformerValidator.js.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"file":"RowTransformerValidator.js","sourceRoot":"","sources":["../../../src/transforms/RowTransformerValidator.ts"],"names":[],"mappings":";;;;;;AAAA,0EAA2C;AAC3C,oCASkB;AAIlB,MAAa,uBAAuB;IAApC;QAsCY,kBAAa,GAAmC,IAAI,CAAC;QAErD,kBAAa,GAA2B,IAAI,CAAC;IAiDzD,CAAC;IAxFG,wDAAwD;IAChD,MAAM,CAAC,eAAe,CAC1B,iBAA6C;QAE7C,IAAI,uBAAe,CAAC,iBAAiB,CAAC,EAAE;YACpC,OAAO,CAAC,GAAM,EAAE,EAA2B,EAAQ,EAAE;gBACjD,IAAI,WAAW,GAAa,IAAI,CAAC;gBACjC,IAAI;oBACA,WAAW,GAAG,iBAAiB,CAAC,GAAG,CAAC,CAAC;iBACxC;gBAAC,OAAO,CAAC,EAAE;oBACR,OAAO,EAAE,CAAC,CAAC,CAAC,CAAC;iBAChB;gBACD,OAAO,EAAE,CAAC,IAAI,EAAE,WAAW,CAAC,CAAC;YACjC,CAAC,CAAC;SACL;QACD,OAAO,iBAAiB,CAAC;IAC7B,CAAC;IAEO,MAAM,CAAC,eAAe,CAAgB,gBAAgC;QAC1E,IAAI,sBAAc,CAAC,gBAAgB,CAAC,EAAE;YAClC,OAAO,CAAC,GAAM,EAAE,EAA2B,EAAQ,EAAE;gBACjD,EAAE,CAAC,IAAI,EAAE,EAAE,GAAG,EAAE,OAAO,EAAE,gBAAgB,CAAC,GAAG,CAAC,EAAE,CAAC,CAAC;YACtD,CAAC,CAAC;SACL;QACD,OAAO,CAAC,GAAM,EAAE,EAA2B,EAAQ,EAAE;YACjD,gBAAgB,CAAC,GAAG,EAAE,CAAC,GAAG,EAAE,OAAO,EAAE,MAAM,EAAQ,EAAE;gBACjD,IAAI,GAAG,EAAE;oBACL,OAAO,EAAE,CAAC,GAAG,CAAC,CAAC;iBAClB;gBACD,IAAI,OAAO,EAAE;oBACT,OAAO,EAAE,CAAC,IAAI,EAAE,EAAE,GAAG,EAAE,OAAO,EAAE,MAAM,EAAE,CAAC,CAAC;iBAC7C;gBACD,OAAO,EAAE,CAAC,IAAI,EAAE,EAAE,GAAG,EAAE,OAAO,EAAE,KAAK,EAAE,MAAM,EAAE,CAAC,CAAC;YACrD,CAAC,CAAC,CAAC;QACP,CAAC,CAAC;IACN,CAAC;IAMD,IAAW,YAAY,CAAC,iBAA6C;QACjE,IAAI,CAAC,2BAAU,CAAC,iBAAiB,CAAC,EAAE;YAChC,MAAM,IAAI,SAAS,CAAC,oCAAoC,CAAC,CAAC;SAC7D;QACD,IAAI,CAAC,aAAa,GAAG,uBAAuB,CAAC,eAAe,CAAC,iBAAiB,CAAC,CAAC;IACpF,CAAC;IAED,IAAW,YAAY,CAAC,gBAAgC;QACpD,IAAI,CAAC,2BAAU,CAAC,gBAAgB,CAAC,EAAE;YAC/B,MAAM,IAAI,SAAS,CAAC,mCAAmC,CAAC,CAAC;SAC5D;QACD,IAAI,CAAC,aAAa,GAAG,uBAAuB,CAAC,eAAe,CAAC,gBAAgB,CAAC,CAAC;IACnF,CAAC;IAEM,oBAAoB,CAAC,GAAM,EAAE,EAA2B;QAC3D,OAAO,IAAI,CAAC,eAAe,CAAC,GAAG,EAAE,CAAC,YAAY,EAAE,cAAc,EAAQ,EAAE;YACpE,IAAI,YAAY,EAAE;gBACd,OAAO,EAAE,CAAC,YAAY,CAAC,CAAC;aAC3B;YACD,IAAI,CAAC,cAAc,EAAE;gBACjB,OAAO,EAAE,CAAC,IAAI,EAAE,EAAE,GAAG,EAAE,IAAI,EAAE,OAAO,EAAE,IAAI,EAAE,CAAC,CAAC;aACjD;YACD,OAAO,IAAI,CAAC,aAAa,CAAC
,cAAc,EAAE,CAAC,WAAW,EAAE,gBAAgB,EAAQ,EAAE;gBAC9E,IAAI,WAAW,EAAE;oBACb,OAAO,EAAE,CAAC,WAAW,CAAC,CAAC;iBAC1B;gBACD,IAAI,gBAAgB,IAAI,CAAC,gBAAgB,CAAC,OAAO,EAAE;oBAC/C,OAAO,EAAE,CAAC,IAAI,EAAE,EAAE,GAAG,EAAE,cAAc,EAAE,OAAO,EAAE,KAAK,EAAE,MAAM,EAAE,gBAAgB,CAAC,MAAM,EAAE,CAAC,CAAC;iBAC7F;gBACD,OAAO,EAAE,CAAC,IAAI,EAAE,EAAE,GAAG,EAAE,cAAc,EAAE,OAAO,EAAE,IAAI,EAAE,CAAC,CAAC;YAC5D,CAAC,CAAC,CAAC;QACP,CAAC,CAAC,CAAC;IACP,CAAC;IAEO,eAAe,CAAC,GAAM,EAAE,EAA2B;QACvD,IAAI,CAAC,IAAI,CAAC,aAAa,EAAE;YACrB,OAAO,EAAE,CAAC,IAAI,EAAG,GAAkB,CAAC,CAAC;SACxC;QACD,OAAO,IAAI,CAAC,aAAa,CAAC,GAAG,EAAE,EAAE,CAAC,CAAC;IACvC,CAAC;IAEO,aAAa,CAAC,GAAM,EAAE,EAA2B;QACrD,IAAI,CAAC,IAAI,CAAC,aAAa,EAAE;YACrB,OAAO,EAAE,CAAC,IAAI,EAAE,EAAE,GAAG,EAAE,OAAO,EAAE,IAAI,EAAE,CAAC,CAAC;SAC3C;QACD,OAAO,IAAI,CAAC,aAAa,CAAC,GAAG,EAAE,EAAE,CAAC,CAAC;IACvC,CAAC;CACJ;AAzFD,0DAyFC"}
2
doc/test-data/purchase_transaction/node_modules/@fast-csv/parse/build/src/transforms/index.d.ts
generated
vendored
Normal file
@@ -0,0 +1,2 @@
export { RowTransformerValidator } from './RowTransformerValidator';
export { HeaderTransformer } from './HeaderTransformer';
8
doc/test-data/purchase_transaction/node_modules/@fast-csv/parse/build/src/transforms/index.js
generated
vendored
Normal file
@@ -0,0 +1,8 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.HeaderTransformer = exports.RowTransformerValidator = void 0;
var RowTransformerValidator_1 = require("./RowTransformerValidator");
Object.defineProperty(exports, "RowTransformerValidator", { enumerable: true, get: function () { return RowTransformerValidator_1.RowTransformerValidator; } });
var HeaderTransformer_1 = require("./HeaderTransformer");
Object.defineProperty(exports, "HeaderTransformer", { enumerable: true, get: function () { return HeaderTransformer_1.HeaderTransformer; } });
//# sourceMappingURL=index.js.map
doc/test-data/purchase_transaction/node_modules/@fast-csv/parse/build/src/transforms/index.js.map (generated, vendored, new file, 1 line)
@@ -0,0 +1 @@
{"version":3,"file":"index.js","sourceRoot":"","sources":["../../../src/transforms/index.ts"],"names":[],"mappings":";;;AAAA,qEAAoE;AAA3D,kIAAA,uBAAuB,OAAA;AAChC,yDAAwD;AAA/C,sHAAA,iBAAiB,OAAA"}
doc/test-data/purchase_transaction/node_modules/@fast-csv/parse/build/src/types.d.ts (generated, vendored, new file, 21 lines)
@@ -0,0 +1,21 @@
export declare type RowMap<V = any> = Record<string, V>;
export declare type RowArray<V = any> = V[];
export declare type Row<V = any> = RowMap<V> | RowArray<V>;
export interface RowValidationResult<R extends Row> {
    row: R | null;
    isValid: boolean;
    reason?: string;
}
export declare type RowValidatorCallback<R extends Row> = (error: Error | null, result?: RowValidationResult<R>) => void;
export declare type RowTransformCallback<R extends Row> = (error?: Error | null, row?: R) => void;
export declare type SyncRowTransform<I extends Row, O extends Row> = (row: I) => O;
export declare type AsyncRowTransform<I extends Row, O extends Row> = (row: I, cb: RowTransformCallback<O>) => void;
export declare type RowTransformFunction<I extends Row, O extends Row> = SyncRowTransform<I, O> | AsyncRowTransform<I, O>;
export declare const isSyncTransform: <I extends Row<any>, O extends Row<any>>(transform: RowTransformFunction<I, O>) => transform is SyncRowTransform<I, O>;
export declare type RowValidateCallback = (error?: Error | null, isValid?: boolean, reason?: string) => void;
export declare type SyncRowValidate<R extends Row> = (row: R) => boolean;
export declare type AsyncRowValidate<R extends Row> = (row: R, cb: RowValidateCallback) => void;
export declare type RowValidate<R extends Row> = AsyncRowValidate<R> | SyncRowValidate<R>;
export declare const isSyncValidate: <R extends Row<any>>(validate: RowValidate<R>) => validate is SyncRowValidate<R>;
export declare type HeaderArray = (string | undefined | null)[];
export declare type HeaderTransformFunction = (headers: HeaderArray) => HeaderArray;
doc/test-data/purchase_transaction/node_modules/@fast-csv/parse/build/src/types.js (generated, vendored, new file, 6 lines)
@@ -0,0 +1,6 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.isSyncValidate = exports.isSyncTransform = void 0;
exports.isSyncTransform = (transform) => transform.length === 1;
exports.isSyncValidate = (validate) => validate.length === 1;
//# sourceMappingURL=types.js.map
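The two helpers in types.js implement fast-csv's sync/async dispatch by function arity: a transform or validator declared with exactly one parameter is treated as synchronous, while a second parameter is assumed to be a completion callback. A minimal standalone sketch of that detection (the sample transforms here are illustrative and not part of this diff):

```javascript
// Arity-based detection, mirroring the vendored types.js:
// Function.prototype.length counts a function's declared parameters.
const isSyncTransform = (transform) => transform.length === 1;
const isSyncValidate = (validate) => validate.length === 1;

// Illustrative transforms (hypothetical, for demonstration only).
const syncUpper = (row) => row.map((v) => v.toUpperCase());               // sync: row only
const asyncUpper = (row, cb) => cb(null, row.map((v) => v.toUpperCase())); // async: row + callback

console.log(isSyncTransform(syncUpper));              // true
console.log(isSyncTransform(asyncUpper));             // false
console.log(isSyncValidate((row) => row.length > 0)); // true
```

Note that `Function.prototype.length` ignores default and rest parameters, so a callback-style function written as `(row, cb = noop) => ...` would be misclassified as synchronous under this scheme.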
doc/test-data/purchase_transaction/node_modules/@fast-csv/parse/build/src/types.js.map (generated, vendored, new file, 1 line)
@@ -0,0 +1 @@
{"version":3,"file":"types.js","sourceRoot":"","sources":["../../src/types.ts"],"names":[],"mappings":";;;AAoBa,QAAA,eAAe,GAAG,CAC3B,SAAqC,EACF,EAAE,CAAC,SAAS,CAAC,MAAM,KAAK,CAAC,CAAC;AAQpD,QAAA,cAAc,GAAG,CAAgB,QAAwB,EAAkC,EAAE,CACtG,QAAQ,CAAC,MAAM,KAAK,CAAC,CAAC"}
Some files were not shown because too many files have changed in this diff.