Batch Add People

Overview

When importing large numbers of cardholders, adding people one at a time is inefficient. The batch add operation allows you to create multiple person records in a single API call, dramatically improving performance for bulk imports from HR systems, CSV files, or other data sources.


When to Use Batch Operations

| Scenario | Method | Recommendation |
|---|---|---|
| 1-10 people | Single add | Use AddPersonAsync individually |
| 11-100 people | Batch add | Use AddPeopleBatchAsync |
| 101-1,000 people | Chunked batch | Split into 100-person batches |
| 1,000+ people | Chunked batch with retry | Split batches with error handling |
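The splitting rule from the table can be sketched as a small generic helper (the helper name is ours; the Chunked Batch Processing section later in this page shows the same grouping written inline):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public static class BatchHelper
{
    // Split an array into consecutive chunks of at most batchSize items,
    // preserving order. On .NET 6+ this is equivalent to Enumerable.Chunk.
    public static List<T[]> SplitIntoBatches<T>(T[] items, int batchSize = 100)
    {
        if (batchSize <= 0) throw new ArgumentOutOfRangeException(nameof(batchSize));
        return items
            .Select((item, index) => (item, index))
            .GroupBy(x => x.index / batchSize)
            .Select(g => g.Select(x => x.item).ToArray())
            .ToList();
    }
}
```

For example, 250 people split with the default size yields three batches of 100, 100, and 50.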

Basic Batch Add

Add multiple people in a single API call:

var count = await client.AddPeopleBatchAsync(currentInstance, new PersonInfo[]
{
    new PersonInfo
    {
        CommonName = "JaneDoe",
        GivenName = "Jane",
        Surname = "Doe",
        Addresses = new AddressInfo[]
        {
            new PhoneInfo { Number = "847-691-0602", Type = "Work" },
            new EmailAddressInfo { MailTo = "jane.doe@company.com", Type = "Work" }
        }
    },
    new PersonInfo
    {
        CommonName = "BobSmith",
        GivenName = "Bob",
        Surname = "Smith",
        Addresses = new AddressInfo[]
        {
            new PhoneInfo { Number = "847-555-1234", Type = "Work" },
            new EmailAddressInfo { MailTo = "bob.smith@company.com", Type = "Work" }
        }
    },
    new PersonInfo
    {
        CommonName = "JillSmith",
        GivenName = "Jill",
        Surname = "Smith",
        Addresses = new AddressInfo[]
        {
            new PhoneInfo { Number = "847-555-5678", Type = "Work" },
            new EmailAddressInfo { MailTo = "jill.smith@company.com", Type = "Work" }
        }
    }
});

Console.WriteLine($"Successfully added {count} people");

Import from CSV File

CSV File Format

first_name,last_name,email,phone,department,employee_id
John,Doe,john.doe@company.com,555-0101,Engineering,EMP001
Jane,Smith,jane.smith@company.com,555-0102,Marketing,EMP002
Bob,Johnson,bob.johnson@company.com,555-0103,Sales,EMP003

C# CSV Import Implementation

using CsvHelper;
using System.Globalization;
using System.Linq;

// Define class matching CSV columns
public class EmployeeRecord
{
    public string first_name { get; set; }
    public string last_name { get; set; }
    public string email { get; set; }
    public string phone { get; set; }
    public string department { get; set; }
    public string employee_id { get; set; }
}

// Import people from CSV
public async Task<int> ImportFromCsvAsync(KeepClient client, InstanceInfo instance, string filePath)
{
    using var reader = new StreamReader(filePath);
    using var csv = new CsvReader(reader, CultureInfo.InvariantCulture);
    
    var records = csv.GetRecords<EmployeeRecord>().ToList();
    
    // Convert CSV records to PersonInfo objects
    var people = records.Select(r => new PersonInfo
    {
        CommonName = $"{r.first_name}{r.last_name}",  // Unique identifier
        GivenName = r.first_name,
        Surname = r.last_name,
        Addresses = new AddressInfo[]
        {
            new EmailAddressInfo { Type = "Work", MailTo = r.email },
            new PhoneInfo { Type = "Work", Number = r.phone }
        },
        // Tags must be lowercase with no spaces
        Tags = new string[] { r.department.ToLower().Replace(" ", "-") },
        Monikers = new MonikerItem[]
        {
            new MonikerItem 
            { 
                Namespace = "HR_System", 
                Nickname = r.employee_id 
            }
        }
    }).ToArray();
    
    return await client.AddPeopleBatchAsync(instance, people);
}

Chunked Batch Processing

For large imports, split into manageable chunks with progress tracking:

public async Task<(int success, int failed)> ImportLargeBatchAsync(
    KeepClient client, 
    InstanceInfo instance, 
    PersonInfo[] allPeople,
    int batchSize = 100,
    IProgress<int> progress = null)
{
    int totalSuccess = 0;
    int totalFailed = 0;
    
    // Split into chunks
    var batches = allPeople
        .Select((person, index) => new { person, index })
        .GroupBy(x => x.index / batchSize)
        .Select(g => g.Select(x => x.person).ToArray())
        .ToList();
    
    Console.WriteLine($"Processing {allPeople.Length} people in {batches.Count} batches...");
    
    for (int i = 0; i < batches.Count; i++)
    {
        try
        {
            var batch = batches[i];
            var count = await client.AddPeopleBatchAsync(instance, batch);
            totalSuccess += count;
            
            Console.WriteLine($"Batch {i + 1}/{batches.Count}: Added {count} people");
            progress?.Report(totalSuccess);
        }
        catch (Exception ex)
        {
            totalFailed += batches[i].Length;
            Console.WriteLine($"Batch {i + 1} failed: {ex.Message}");
            
            // Optionally save failed batch for retry
            await SaveFailedBatchAsync(batches[i], i, ex.Message);
        }
        
        // Optional: Add delay between batches to avoid rate limiting
        await Task.Delay(100);
    }
    
    return (totalSuccess, totalFailed);
}

private async Task SaveFailedBatchAsync(PersonInfo[] batch, int batchNumber, string error)
{
    // Requires Newtonsoft.Json (using Newtonsoft.Json;)
    var failedFile = $"failed_batch_{batchNumber}_{DateTime.Now:yyyyMMddHHmmss}.json";
    var json = JsonConvert.SerializeObject(batch, Formatting.Indented);
    await File.WriteAllTextAsync(failedFile, json);
    Console.WriteLine($"Saved failed batch {batchNumber} ({error}) to {failedFile}");
}

Batch with Access Levels and Badge Types

Include access permissions in the batch:

// Pre-fetch the access level and badge type once, outside the loop
var accessLevel = (await client.SearchAsync(currentInstance, "{\"CommonName\":\"Standard Employee Access\"}"))
    .OfType<AccessLevelInfo>().FirstOrDefault();
var badgeType = (await client.SearchAsync(currentInstance, "{\"CommonName\":\"Full-Time Employee\"}"))
    .OfType<BadgeTypeInfo>().FirstOrDefault();

if (accessLevel == null || badgeType == null)
    throw new InvalidOperationException("Access level or badge type not found; check the CommonName values.");

// Create people with full permissions
var people = records.Select(r => new PersonInfo
{
    CommonName = $"{r.first_name}{r.last_name}",
    GivenName = r.first_name,
    Surname = r.last_name,
    ObjectLinks = new ObjectLinkItem[]
    {
        // Link to access level
        new ObjectLinkItem
        {
            CommonName = accessLevel.CommonName,
            Href = accessLevel.Href,
            LinkedObjectKey = accessLevel.Key,
            Relation = "AccessLevel"
        },
        // Link to badge type
        new ObjectLinkItem
        {
            CommonName = badgeType.CommonName,
            Href = badgeType.Href,
            LinkedObjectKey = badgeType.Key,
            Relation = "BadgeType"
        }
    },
    CardAssignments = new CardAssignmentInfo[]
    {
        new CardAssignmentInfo
        {
            // Assumes the CSV also has a card_number column (extend EmployeeRecord to match)
            DisplayCardNumber = r.card_number,
            EncodedCardNumber = long.Parse(r.card_number),  // throws if the value is not numeric
            ActiveOn = DateTime.UtcNow,
            ExpiresOn = DateTime.UtcNow.AddYears(1)
        }
    }
}).ToArray();

await client.AddPeopleBatchAsync(currentInstance, people);

Validation Before Import

Validate data before sending to API:

public class ImportValidator
{
    public List<(PersonInfo Person, string Error)> ValidateRecords(PersonInfo[] people)
    {
        var errors = new List<(PersonInfo, string)>();
        var seenNames = new HashSet<string>();
        
        foreach (var person in people)
        {
            // Check required fields
            if (string.IsNullOrWhiteSpace(person.GivenName))
                errors.Add((person, "GivenName is required"));
            
            if (string.IsNullOrWhiteSpace(person.Surname))
                errors.Add((person, "Surname is required"));
            
            // Check for duplicate CommonNames
            if (seenNames.Contains(person.CommonName))
                errors.Add((person, $"Duplicate CommonName: {person.CommonName}"));
            else
                seenNames.Add(person.CommonName);
            
            // Validate email format
            var emails = person.Addresses?.OfType<EmailAddressInfo>();
            if (emails != null)
            {
                foreach (var email in emails)
                {
                    if (!IsValidEmail(email.MailTo))
                        errors.Add((person, $"Invalid email: {email.MailTo}"));
                }
            }
            
            // Validate card numbers
            if (person.CardAssignments != null)
            {
                foreach (var card in person.CardAssignments)
                {
                    if (card.EncodedCardNumber <= 0)
                        errors.Add((person, $"Invalid card number: {card.DisplayCardNumber}"));
                }
            }
        }
        
        return errors;
    }
    
    private bool IsValidEmail(string email)
    {
        try { var addr = new System.Net.Mail.MailAddress(email); return addr.Address == email; }
        catch { return false; }
    }
}

// Usage
var validator = new ImportValidator();
var errors = validator.ValidateRecords(people);

if (errors.Any())
{
    Console.WriteLine($"Found {errors.Count} validation errors:");
    foreach (var (person, error) in errors)
        Console.WriteLine($"  {person.CommonName}: {error}");
    return;
}

await client.AddPeopleBatchAsync(currentInstance, people);

cURL Example

curl -X POST \
  "https://api.us.acresecurity.cloud/api/f/INSTANCE_KEY/people/complex" \
  -H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
  -H "Content-Type: application/json" \
  -d '[
    {
      "$type": "Feenics.Keep.WebApi.Model.PersonInfo, Feenics.Keep.WebApi.Model",
      "CommonName": "JohnDoe",
      "GivenName": "John",
      "Surname": "Doe",
      "Addresses": [
        {
          "$type": "Feenics.Keep.WebApi.Model.EmailAddressInfo, Feenics.Keep.WebApi.Model",
          "MailTo": "john.doe@company.com",
          "Type": "Work"
        }
      ]
    },
    {
      "$type": "Feenics.Keep.WebApi.Model.PersonInfo, Feenics.Keep.WebApi.Model",
      "CommonName": "JaneSmith",
      "GivenName": "Jane",
      "Surname": "Smith",
      "Addresses": [
        {
          "$type": "Feenics.Keep.WebApi.Model.EmailAddressInfo, Feenics.Keep.WebApi.Model",
          "MailTo": "jane.smith@company.com",
          "Type": "Work"
        }
      ]
    }
  ]'

Performance Guidelines

| Batch Size | Approximate Time | Recommendation |
|---|---|---|
| 10 people | ~1 second | Good for testing |
| 100 people | ~5-10 seconds | Optimal for most imports |
| 500 people | ~30-60 seconds | Add progress indicators |
| 1000+ people | Varies | Split into 100-person batches |

Best Practices

| Practice | Recommendation |
|---|---|
| Batch Size | Use 100 people per batch for optimal performance |
| Validation | Validate all records before sending to API |
| Error Handling | Save failed batches for retry rather than failing entire import |
| Unique Names | Ensure CommonName is unique across all records |
| Progress Tracking | Show progress for imports over 100 records |
| Idempotency | Use Monikers to track records for update/skip logic on re-import |
| Testing | Test with small batch first before full import |
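The idempotency practice can be sketched as a partitioning step: collect the employee IDs already stored in the system (for example, as Moniker nicknames in the HR_System namespace used earlier), then split incoming records into new adds versus existing people to update or skip. The helper below is a minimal generic sketch; the helper name and the way you gather existing IDs are up to your integration.

```csharp
using System;
using System.Collections.Generic;

public static class ImportPartitioner
{
    // Split incoming records into (ToAdd, Existing) based on an ID already
    // known to the system, e.g. an employee_id stored as a Moniker nickname.
    public static (List<T> ToAdd, List<T> Existing) Partition<T>(
        IEnumerable<T> incoming,
        Func<T, string> idSelector,
        ISet<string> existingIds)
    {
        var toAdd = new List<T>();
        var existing = new List<T>();
        foreach (var item in incoming)
            (existingIds.Contains(idSelector(item)) ? existing : toAdd).Add(item);
        return (toAdd, existing);
    }
}
```

With PersonInfo records you would pass a selector such as p => p.Monikers[0].Nickname, send ToAdd to AddPeopleBatchAsync, and route Existing through your update path.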

Troubleshooting

| Issue | Cause | Solution |
|---|---|---|
| Partial batch failure | One invalid record | Validate before import, use chunking |
| Timeout errors | Batch too large | Reduce batch size to 50-100 |
| Duplicate errors | CommonName exists | Pre-check existing names or update instead |
| Memory issues | Loading entire CSV | Use streaming CSV parser |
| Rate limiting | Too many requests | Add delays between batches |
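Batches saved by SaveFailedBatchAsync in the chunked-processing example can be replayed later. A minimal sketch, assuming Newtonsoft.Json and the same failed_batch_*.json naming convention; the method name is ours:

```csharp
using System;
using System.IO;
using System.Threading.Tasks;
using Newtonsoft.Json;

public async Task RetryFailedBatchesAsync(KeepClient client, InstanceInfo instance, string directory = ".")
{
    foreach (var file in Directory.GetFiles(directory, "failed_batch_*.json"))
    {
        var batch = JsonConvert.DeserializeObject<PersonInfo[]>(await File.ReadAllTextAsync(file));
        var count = await client.AddPeopleBatchAsync(instance, batch);
        File.Delete(file);  // remove the snapshot once it has been re-imported
        Console.WriteLine($"Retried {Path.GetFileName(file)}: added {count} people");
    }
}
```

If a retried batch fails again, the exception propagates and the file is kept, so the snapshot is never lost before a successful import.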