# Import Logging Utilities

## Overview

The log_data.js utility provides comprehensive logging functionality for CSV import operations. It tracks success/failure counts, stores error details with line numbers, updates import result documents, and emits real-time socket notifications to users.

Source File: `queue-manager/common/log_data.js`

## Purpose
- Progress Tracking: Maintains running count of successful/failed imports
- Error Logging: Captures detailed error messages with CSV row context
- Result Persistence: Stores import results in MongoDB collections
- Real-time Updates: Emits socket events for live import progress (currently commented out)
- Audit Trail: Links errors to specific CSV rows for user troubleshooting
## Functions

### 1. Contact Import Logging (Default Export)

```js
module.exports = async (
  contact,
  totalRecordsAdded,
  totalErrors,
  logId,
  crmContacts,
  owner,
  uid,
  logs,
  csvContact,
  type,
  account_id,
) => { /* ... */ };
```
#### Parameters

- `contact` (Object) - Result from the bulk insert operation
  - `savedContacts` - Array of successfully inserted contacts
  - `errors` - Array of error objects with positions
- `totalRecordsAdded` (Number) - Running count of successful inserts
- `totalErrors` (Number) - Running count of failed inserts
- `logId` (ObjectId) - Existing log document ID (or `null` for the first call)
- `crmContacts` (Array) - Contact objects that were attempted
- `owner` (ObjectId) - User ID who initiated the import
- `uid` (String) - User ID string for socket notifications
- `logs` (Array) - Accumulated error logs for the final report
- `csvContact` (Object) - Original CSV row data
- `type` (String) - Import type: `'contact'`, `'business'`, or `'both'`
- `account_id` (ObjectId) - Account ID for the import

#### Returns

```js
{ contact, totalRecordsAdded, totalErrors, logId, crmContacts, logData, logs }
```
### 2. Account Import Logging

```js
module.exports.accountLog = async (
  account,
  totalRecordsAdded,
  totalErrors,
  logId,
  accountInfo,
  owner,
  uid,
  logs,
  csvAccount,
  type,
  account_id,
) => { /* ... */ };
```

#### Parameters

Similar to contact logging, but for account imports:

- `account` - Result from the account bulk insert
- `accountInfo` - Account objects that were attempted

#### Returns

```js
{ account, totalRecordsAdded, totalErrors, logId, accountInfo, logData, logs }
```
## Implementation Details

### Success Counting

```js
if (contact.savedContacts?.length) {
  totalRecordsAdded++;
}
if (contact.errors?.length) {
  totalErrors++;
}
```

Logic:

- Increments `totalRecordsAdded` if any contacts were saved
- Increments `totalErrors` if any errors occurred
- Accumulates across multiple job processing iterations
- Provides real-time progress during bulk import
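The accumulation can be illustrated in isolation. This is a sketch only; the batch objects below are invented examples, not real bulk-insert output:

```javascript
// Standalone illustration of the counter accumulation across job iterations.
// The batch results below are made-up examples.
const batches = [
  { savedContacts: [{ email: 'a@x.com' }], errors: [] },
  { savedContacts: [], errors: [{ position: 0, message: 'duplicate' }] },
  { savedContacts: [{ email: 'b@x.com' }], errors: [{ position: 1, message: 'invalid' }] },
];

let totalRecordsAdded = 0;
let totalErrors = 0;

for (const contact of batches) {
  // Same checks as log_data.js: one increment per batch, not per record
  if (contact.savedContacts?.length) totalRecordsAdded++;
  if (contact.errors?.length) totalErrors++;
}

console.log(totalRecordsAdded, totalErrors); // 2 2
```

Note that each call increments by at most one, regardless of how many records the batch contained.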
### Log Document Creation

```js
let isLogExist = false;
if (logId) {
  isLogExist = await ContactImportResults.findOne({ _id: logId });
}
```
Logic:
- Checks if log document already exists
- First job creates new log document
- Subsequent jobs update existing document
- Enables incremental updates during processing
### Error Processing and Cleanup

```js
if (contact.errors?.length) {
  for (let err of contact.errors) {
    if (crmContacts[err.position]) {
      delete crmContacts[err.position][`imported_${type}`];
      contact.errors[err.position][type] = crmContacts[err.position];
    }
    const logObject = {};
    for (let e in csvContact) {
      logObject[e] = csvContact[e];
    }
    logObject.error = err.message.replace(/ at position \d/i, '');
    logs.push(logObject);
  }
  isLogExist.importerrors.push(...contact.errors);
}
```

Error Handling Logic:

- Temporary Field Cleanup: Removes `imported_contact` / `imported_business` fields
- Error Attachment: Attaches the failed contact data to the error object
- Log Object Creation: Clones the original CSV row for context
- Error Message Normalization: Strips position references from error messages
- Error Accumulation: Adds to the `logs` array for the final report
- Database Update: Appends errors to the `importerrors` array
Example Error Object:

```js
{
  position: 0,
  message: 'E11000 duplicate key error: email at position 0',
  contact: {
    first_name: 'John',
    last_name: 'Doe',
    email: 'john@example.com',
    // ... original contact data
  }
}
```

Resulting Log Entry:

```js
{
  'First Name': 'John',
  'Last Name': 'Doe',
  'Email': 'john@example.com',
  error: 'E11000 duplicate key error: email'
}
```
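The transformation from error object to log entry can be reproduced as a small standalone snippet, reusing the example values above:

```javascript
// Standalone reproduction of the log-entry construction, using the example
// CSV row and error object from this section.
const csvContact = { 'First Name': 'John', 'Last Name': 'Doe', Email: 'john@example.com' };
const err = { position: 0, message: 'E11000 duplicate key error: email at position 0' };

// Clone the original CSV row for context
const logObject = {};
for (const e in csvContact) {
  logObject[e] = csvContact[e];
}

// Normalize the error message by stripping the position reference
logObject.error = err.message.replace(/ at position \d/i, '');

console.log(logObject.error); // 'E11000 duplicate key error: email'
```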
### Updating Existing Log

```js
if (isLogExist && isLogExist.import_errors_count >= 0) {
  isLogExist.import_errors_count = totalErrors;
}
if (isLogExist && isLogExist.records_added >= 0) {
  isLogExist.records_added = totalRecordsAdded;
}
let isUpdated = await isLogExist.save();
logData = isLogExist;
logId = logData._id;
```
Update Logic:
- Counts: Updates success and error counts (not incrementing, but setting)
- Persistence: Saves to MongoDB
- Return Values: Returns updated log document and ID
- Idempotency: Can be called multiple times safely
### Creating New Log Document

```js
let contactImportLog = {
  status: 'in_progress',
  name: 'csv',
  type: type,
  records_added: totalRecordsAdded,
  import_errors_count: totalErrors,
  user_id: new mongoose.Types.ObjectId(owner),
  account_id: account_id,
  importerrors: contact?.errors ? contact.errors : [],
};
let contactimportsave = await new ContactImportResults(contactImportLog).save();
logData = contactimportsave;
logId = logData._id;
```

Creation Logic:

- Initial Status: Always `'in_progress'` (updated to `'completed'` elsewhere)
- Import Source: Always `'csv'` for CSV imports
- Type: `'contact'`, `'business'`, or `'both'`
- Error Array: Initializes with errors from the first batch
- Ownership: Links to user and account
### Socket Notifications (Commented Out)

```js
// socketEmit('import_csv_contact', [uid], {
//   message: `Success: Successfully added contact with the information: ${JSON.stringify(
//     crmContacts[0],
//     replacer,
//   )}`,
// });
```
Why Commented:
- Performance: Too many events for large imports
- Noise: Would spam user with hundreds of notifications
- Alternative: Progress bar or batch notifications recommended
Replacer Function:

```js
const replacer = (key, value) => {
  if (key === 'businesses' || key === 'provider') return undefined;
  else return value;
};
```
- Filters sensitive/redundant fields from socket messages
- Reduces payload size
- Improves readability
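A quick usage sketch shows the replacer in action with `JSON.stringify`; the contact object here is a made-up example:

```javascript
// The replacer drops 'businesses' and 'provider' keys during serialization.
const replacer = (key, value) => {
  if (key === 'businesses' || key === 'provider') return undefined;
  else return value;
};

// Made-up contact for illustration
const contact = {
  first_name: 'John',
  email: 'john@example.com',
  businesses: [{ name: 'Acme' }], // dropped by the replacer
  provider: 'google',             // dropped by the replacer
};

const payload = JSON.stringify(contact, replacer);
console.log(payload); // {"first_name":"John","email":"john@example.com"}
```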
## Data Structures

### ContactImportResults Document

```js
{
  _id: ObjectId,
  status: 'in_progress',    // 'in_progress', 'completed', 'failed'
  name: 'csv',              // Import source
  type: 'contact',          // 'contact', 'business', 'both'
  records_added: 150,       // Successful inserts
  import_errors_count: 5,   // Failed inserts
  user_id: ObjectId,
  account_id: ObjectId,
  importerrors: [
    {
      position: 0,
      message: 'Duplicate email error',
      contact: { ... } // Failed contact data
    },
    // ... more errors
  ],
  createdAt: Date,
  updatedAt: Date
}
```

### AccountImportResults Document

```js
{
  _id: ObjectId,
  name: 'csv',
  type: 'account', // 'account' or 'team_member'
  records_added: 10,
  import_errors_count: 2,
  account_id: ObjectId,
  user_id: ObjectId,
  importerrors: [
    {
      position: 0,
      message: 'Email already exists',
      account: { ... } // Failed account data
    }
  ],
  createdAt: Date,
  updatedAt: Date
}
```
## Usage Patterns

### Contact Import Job Processor

```js
const logData = require('./common/log_data');

// Initialize counters
let totalRecordsAdded = 0;
let totalErrors = 0;
let logId = null;
let logs = [];

// Process each CSV row
for (let contact of csvData) {
  // Insert contact
  const result = await insertContact(contact);

  // Log result
  const logResult = await logData(
    result, // Insert result
    totalRecordsAdded,
    totalErrors,
    logId,
    [contact], // Original contact array
    ownerId,
    userId,
    logs,
    csvRow, // Original CSV row
    'contact',
    accountId,
  );

  // Update counters for next iteration
  totalRecordsAdded = logResult.totalRecordsAdded;
  totalErrors = logResult.totalErrors;
  logId = logResult.logId;
  logs = logResult.logs;
}

// Final status update
await ContactImportResults.updateOne({ _id: logId }, { status: 'completed' });
```
### Account Import Job Processor

```js
const { accountLog } = require('./common/log_data');

let totalRecordsAdded = 0;
let totalErrors = 0;
let logId = null;
let logs = [];

for (let accountData of csvData) {
  const result = await insertAccount(accountData);

  const logResult = await accountLog(
    result,
    totalRecordsAdded,
    totalErrors,
    logId,
    [accountData],
    ownerId,
    userId,
    logs,
    csvRow,
    'account',
    accountId,
  );

  totalRecordsAdded = logResult.totalRecordsAdded;
  totalErrors = logResult.totalErrors;
  logId = logResult.logId;
  logs = logResult.logs;
}
```
## Logging Flow Diagram

```mermaid
sequenceDiagram
    participant JOB as Queue Job
    participant LOG as log_data()
    participant DB as MongoDB
    participant USER as User (via Socket)

    JOB->>LOG: Call with result data
    LOG->>LOG: Count successes/errors
    LOG->>DB: Check if log exists (logId)

    alt Log Exists
        DB-->>LOG: Existing log document
        LOG->>LOG: Process errors
        LOG->>LOG: Update counts
        LOG->>DB: Save updated log
    else Log Doesn't Exist
        LOG->>LOG: Create new log document
        LOG->>DB: Insert new log
    end

    DB-->>LOG: Updated/Created log
    LOG->>USER: [COMMENTED] Emit progress
    LOG-->>JOB: Return updated counters & logId
```
## Configuration

### Required Models

```js
const ContactImportResults = require('../models/contact-import-results');
const AccountImportResults = require('../models/account-import-results');
```

### Socket Event Names

```js
// Contact imports
'import_csv_contact';

// Account imports
'import_csv_account';
```
## Error Handling

### Try-Catch Wrapper

```js
try {
  // Logging logic
} catch (err) {
  logger.error({ initiator: 'QM/log-data', error: err });
  return Promise.reject(err);
}
```
Error Behavior:
- Logs error to application logger
- Rejects promise (fails job)
- Job retries via Bull retry mechanism
- User notified of import failure
### Null Safety

```js
if (contact.savedContacts?.length) { ... }
if (contact.errors?.length) { ... }
```
- Uses optional chaining to prevent null reference errors
- Safely handles missing properties
- Continues processing even if structure is unexpected
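A small sketch demonstrates why the optional chaining matters; the result objects below are invented shapes:

```javascript
// Shapes that would throw with plain `.length` access are handled safely.
// These result objects are made-up examples.
const results = [
  { savedContacts: [{}], errors: [] }, // normal shape
  { savedContacts: null },             // null property, `errors` missing
  {},                                  // empty result
];

let added = 0;
let failed = 0;
for (const contact of results) {
  // `null?.length` and `undefined?.length` both evaluate to undefined
  if (contact.savedContacts?.length) added++;
  if (contact.errors?.length) failed++;
}

console.log(added, failed); // 1 0
```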
## Performance Considerations

### Database Operations

- Single Query per Call: One `findOne` and one `save` per job
- Array Growth: The `importerrors` array can grow large (100+ errors)
- Index Recommendation: Index on `{ account_id: 1, status: 1 }`

### Memory Usage

- Error Accumulation: The `logs` array grows in memory across all jobs
- Risk: Large imports (10,000+ rows) with many errors can consume significant memory
- Mitigation: Consider batch-flushing logs to the database

### Optimization Tips

- Batch Logging: Update the log document every N jobs instead of every job
- Error Limit: Cap the `importerrors` array at 1000 errors
- Summary Only: Store the error count, not full error objects
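The batch-logging tip can be sketched as follows; `saveLog` is a hypothetical stand-in for the real `isLogExist.save()` call, and the batch size of 100 is an example value:

```javascript
// Sketch of batched logging: persist the log document every N jobs instead
// of once per job. `saveLog` is a hypothetical placeholder for the MongoDB save.
const BATCH_SIZE = 100;
let saves = 0;
const saveLog = () => { saves++; };

const totalJobs = 250;
for (let job = 1; job <= totalJobs; job++) {
  // ... process the job, update in-memory counters ...
  if (job % BATCH_SIZE === 0 || job === totalJobs) {
    saveLog(); // flush counters/errors to the database
  }
}

console.log(saves); // 3 saves instead of 250
```

The trailing `job === totalJobs` check ensures the final partial batch is still flushed.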
## Testing Considerations

### Mock Setup

```js
jest.mock('../models/contact-import-results');
jest.mock('../utilities', () => ({
  socketEmit: jest.fn(),
}));

const logData = require('./common/log_data');
```
### Test Cases

```js
describe('logData', () => {
  test('Creates new log on first call', async () => {
    const result = await logData(
      { savedContacts: [{}], errors: [] },
      0,
      0,
      null,
      [{}],
      'user123',
      'user123',
      [],
      {},
      'contact',
      'account123',
    );
    expect(result.logId).toBeDefined();
    expect(result.totalRecordsAdded).toBe(1);
  });

  test('Updates existing log', async () => {
    const existingLogId = 'log123';
    const result = await logData(
      { savedContacts: [], errors: [{ position: 0, message: 'Error' }] },
      5,
      1,
      existingLogId,
      [{}],
      'user123',
      'user123',
      [],
      {},
      'contact',
      'account123',
    );
    expect(result.totalErrors).toBe(2);
    expect(result.logId).toBe(existingLogId);
  });
});
```
## Related Documentation
- Contact Import Processing - Primary consumer
- Account Import - Uses accountLog function
- Common Utilities Overview
- Downgrade Logs - Similar logging pattern
## Notes

### Socket Notifications
The socket notification code is commented out but can be re-enabled for:
- Progress Bars: Batch notifications every 100 records
- Completion Alerts: Single notification on import complete
- Error Summaries: Send error count, not individual errors
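A batched progress notification could look like the sketch below. `socketEmit` is stubbed here for illustration (the real utility lives in the queue manager's utilities module), and the every-100-records threshold is an example value:

```javascript
// Sketch of batched progress notifications: one event per 100 records
// instead of one per record. `socketEmit` is a stub for illustration.
const events = [];
const socketEmit = (event, uids, payload) => events.push({ event, payload });

const uid = 'user123';
const totalRecords = 350;
for (let processed = 1; processed <= totalRecords; processed++) {
  if (processed % 100 === 0 || processed === totalRecords) {
    socketEmit('import_csv_contact', [uid], {
      message: `Processed ${processed} of ${totalRecords} records`,
    });
  }
}

console.log(events.length); // 4 notifications for 350 records
```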
### Error Position Cleanup

```js
err.message.replace(/ at position \d/i, '');
```

This regex removes " at position 0" from error messages because:

- The position is redundant (already stored in `err.position`)
- It produces cleaner user-facing error messages
- It keeps formatting consistent across error types
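Note that `\d` in the pattern matches a single digit, so single-digit positions are stripped cleanly while multi-digit positions (10 and up) are only partially removed; a `\d+` variant would cover both:

```javascript
// The documented pattern, and a \d+ variant for multi-digit positions.
const strip = (msg) => msg.replace(/ at position \d/i, '');
const stripAll = (msg) => msg.replace(/ at position \d+/i, '');

console.log(strip('duplicate email at position 7'));     // 'duplicate email'
console.log(strip('duplicate email at position 12'));    // 'duplicate email2' (trailing digit survives)
console.log(stripAll('duplicate email at position 12')); // 'duplicate email'
```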
### Status Management

The `status` field in log documents is managed externally:

- `log_data.js` creates the document with `'in_progress'`
- The calling service updates it to `'completed'` or `'failed'`
- This allows for partial success tracking
Complexity: Medium
Business Impact: High - Critical for user feedback and troubleshooting
Dependencies: Mongoose models, socketEmit utility, logger
Last Updated: 2025-10-10