CRM/CSV
Learn how to import, validate, and export CRM data using CSV files.
Overview
Plasma provides powerful tools for managing customer relationship data through CSV files and direct CRM integrations.
Prepare Your Data
Ensure your CSV files follow the correct format:
- customers.csv
- contacts.csv
- leads.csv
- companies.csv
Example CSV structure:
id,name,email,phone,company,status,created_date
1,John Doe,john@example.com,+1234567890,Acme Corp,active,2024-01-15
2,Jane Smith,jane@example.com,+0987654321,Tech Ltd,inactive,2024-01-16
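Before importing, it can help to confirm that a file's header row matches the expected columns. A minimal sketch in plain Node.js (no Plasma APIs; the expected column list is taken from the example above, and the naive comma split assumes no quoted commas in the header):

import { createReadStream } from 'node:fs';
import { createInterface } from 'node:readline';

// Column names expected in customers.csv, taken from the example above.
const EXPECTED = ['id', 'name', 'email', 'phone', 'company', 'status', 'created_date'];

async function checkHeader(path: string): Promise<void> {
  const rl = createInterface({ input: createReadStream(path) });
  for await (const line of rl) {
    // Naive split: assumes the header row contains no quoted commas.
    const header = line.split(',').map((col) => col.trim());
    const missing = EXPECTED.filter((col) => !header.includes(col));
    if (missing.length > 0) {
      throw new Error(`${path} is missing columns: ${missing.join(', ')}`);
    }
    break; // only the header row matters
  }
}

await checkHeader('customers.csv');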
Import Data
Use Plasma's import functionality:
import { csvImporter } from 'plasma';

const result = await csvImporter.import('customers.csv', {
  mapping: {
    'Customer Name': 'name',
    'Email Address': 'email',
    'Phone Number': 'phone',
  },
  validation: true,
  batchSize: 1000,
});
Validate & Process
Review imported data and handle errors:
if (result.errors.length > 0) {
  console.log('Import errors:', result.errors);
}

console.log(`Imported ${result.successful} records`);
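For anything beyond a quick console check, it can help to persist the error list for later review. A minimal sketch (the exact shape of each error entry is whatever the importer returns, so it is written out as-is):

import { writeFileSync } from 'node:fs';

// Write failed rows to a file so they can be fixed and re-imported.
if (result.errors.length > 0) {
  writeFileSync('import-errors.json', JSON.stringify(result.errors, null, 2));
  console.warn(`Wrote ${result.errors.length} errors to import-errors.json`);
}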
Data Formats
Customer CSV Format
Required fields for customer data:
| Prop | Type | Default |
| --- | --- | --- |
| id | string \| number | - |
| name | string | - |
| email | string | - |
| phone? | string | - |
| company? | string | - |
| status? | "active" \| "inactive" \| "prospect" | "prospect" |
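If you work with imported rows in TypeScript, the table above maps to a simple record type. This is a sketch derived from the table, not a type exported by Plasma:

// Customer record shape matching the table above.
// Not part of the Plasma API; adjust to your own data model.
interface CustomerRecord {
  id: string | number;
  name: string;
  email: string;
  phone?: string;
  company?: string;
  status?: 'active' | 'inactive' | 'prospect'; // 'prospect' is the listed default
}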
Lead CSV Format
id,name,email,source,score,stage,assigned_to
1,Alice Johnson,alice@test.com,website,85,qualified,john.doe
2,Bob Wilson,bob@test.com,referral,65,contacted,jane.smith
Lead scoring integration:
const leadProcessor = await csvImporter.processLeads('leads.csv', {
  scoreThreshold: 70,
  autoAssign: true,
  notifyAssignee: true,
});
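With the sample rows above and scoreThreshold set to 70, Alice Johnson (score 85) clears the threshold while Bob Wilson (score 65) does not, so only Alice's lead would be auto-assigned and trigger a notification (assuming scoreThreshold is a minimum score, which is how it reads here).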
Company CSV Format
id,name,domain,industry,size,revenue,location
1,Acme Corp,acme.com,Technology,500,10000000,New York
2,Tech Solutions,techsol.com,Consulting,50,2000000,California
Contact CSV Format
id,first_name,last_name,email,title,company_id,department
1,John,Doe,john@acme.com,Manager,1,Sales
2,Jane,Smith,jane@acme.com,Developer,1,Engineering
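Because each contact row references a company through company_id, it is safest to import companies before contacts so that every company_id already points at an existing record. A minimal sketch reusing the import call shown earlier (the batchSize values are arbitrary, and omitting mapping when headers already match the field names is an assumption):

import { csvImporter } from 'plasma';

// Import companies first so that company_id values in contacts.csv
// refer to records that already exist.
const companies = await csvImporter.import('companies.csv', {
  validation: true,
  batchSize: 500,
});

const contacts = await csvImporter.import('contacts.csv', {
  validation: true,
  batchSize: 500,
});

console.log(`Imported ${companies.successful} companies and ${contacts.successful} contacts`);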
CRM Integration
Data Validation
Ensure data quality with built-in validation:
import { csvImporter, validator } from 'plasma';

const schema = {
  email: validator.email(),
  phone: validator.phone(),
  name: validator.string().min(2).max(100),
  company: validator.string().optional(),
  status: validator.enum(['active', 'inactive', 'prospect']),
};

const validationResult = await csvImporter.validate('customers.csv', schema);

if (!validationResult.isValid) {
  console.log('Validation errors:', validationResult.errors);
}
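A common pattern is to gate the import on the validation result, so a file with schema errors is never partially imported. A sketch combining the two calls above:

// Validate the whole file first, then import only if it passes.
const check = await csvImporter.validate('customers.csv', schema);

if (check.isValid) {
  const imported = await csvImporter.import('customers.csv', {
    validation: true,
    batchSize: 1000,
  });
  console.log(`Imported ${imported.successful} records`);
} else {
  console.log('Fix these rows before importing:', check.errors);
}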
Export Features
// crm and the exporters are assumed to be exported from the same
// 'plasma' package as csvImporter and validator.
import { crm, csvExporter, excelExporter, jsonExporter } from 'plasma';

// Export filtered customer data
const activeCustomers = await crm.getCustomers({
  status: 'active',
  lastActivity: { since: '2024-01-01' },
});

// Export to CSV
await csvExporter.export(activeCustomers, 'active-customers.csv');

// Export to Excel
await excelExporter.export(activeCustomers, 'active-customers.xlsx');

// Export to JSON
await jsonExporter.export(activeCustomers, 'active-customers.json');
Batch Processing
Handle large datasets efficiently:
const batchProcessor = csvImporter.createBatchProcessor({
  batchSize: 1000,
  concurrency: 5,
  retries: 3,
});

batchProcessor.onProgress((progress) => {
  console.log(`Processed ${progress.completed}/${progress.total} records`);
});

batchProcessor.onError((error, batch) => {
  console.error(`Batch ${batch.id} failed:`, error);
});

await batchProcessor.process('large-dataset.csv');
Best Practices
Performance Tip: Use batch processing for files larger than 10,000 records.
- Validate data before import
- Use consistent field naming
- Include unique identifiers
- Implement error handling
- Monitor import progress
- Backup data before bulk operations (see the sketch below)
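For the last point, a snapshot export before a bulk import gives you something to restore from if the import goes wrong. A sketch reusing the export calls from above (passing an empty filter to crm.getCustomers to fetch everything is an assumption):

import { crm, csvExporter, csvImporter } from 'plasma';

// Snapshot current customers before a bulk operation.
const snapshot = await crm.getCustomers({}); // empty filter = all customers (assumed)
await csvExporter.export(snapshot, `backup-customers-${Date.now()}.csv`);

// Then run the bulk import.
await csvImporter.import('customers.csv', { validation: true, batchSize: 1000 });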