How to Import CSV Files into SQLite with .NET (Dynamic Schema Generation)
Working with CSV files is a common task in data migration and analytics. But what if you don’t know the schema of the CSV in advance? In this tutorial, we’ll build a .NET service that automatically imports CSV files into SQLite: it infers column types, creates the database schema dynamically, and can be extended to insert every row, with no hardcoding required.
This project is a perfect example of how to solve real-world data problems with clean, reusable .NET code.
Why Import CSV into SQLite?
CSV files are widely used for exporting and transferring data, but they lack type information. SQLite, on the other hand, provides a lightweight, file-based database engine that’s easy to query and share. By combining the two, you can:
- Store CSV data in a structured, queryable format (see the query sketch after this list).
- Automate migrations without knowing the schema upfront.
- Reuse the same utility for any CSV file.
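For instance, once the CSV data has landed in a .db file (the insert step is covered later as an extension), reading it back from .NET takes only a few lines. This is a minimal sketch, assuming a books.db file with a [books] table like the one generated later in this article:

using Microsoft.Data.Sqlite;

// Minimal sketch: query the generated database file.
// Assumes books.db and its [books] table already exist and contain data.
using var connection = new SqliteConnection("Data Source=books.db");
connection.Open();

using var command = connection.CreateCommand();
command.CommandText = "SELECT [Title], [Price] FROM [books]";

using var reader = command.ExecuteReader();
while (reader.Read())
{
    Console.WriteLine($"{reader["Title"]}: {reader["Price"]}");
}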
Project Setup
First, install the SQLite NuGet package:
dotnet add package Microsoft.Data.Sqlite
We’ll implement everything inside a FileProcessingService that takes a CSV upload, analyzes it, and creates a matching SQLite schema in a new .db file.
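If you host the service in an ASP.NET Core app, you would typically register it with the dependency injection container. A minimal Program.cs sketch (the interface and class are defined in the next section):

using CsvToSqliteMigrationUtility.Services;

var builder = WebApplication.CreateBuilder(args);

// Register the CSV-processing service so endpoints and controllers can inject IFileProcessingService.
builder.Services.AddScoped<IFileProcessingService, FileProcessingService>();

var app = builder.Build();
app.Run();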
The FileProcessingService
Here’s the complete service implementation:
using Microsoft.Data.Sqlite;
using System.Globalization;
using System.Text;

namespace CsvToSqliteMigrationUtility.Services;

public interface IFileProcessingService
{
    Task<string> ProcessFileAsync(Stream fileStream, string fileName);
}

public class FileProcessingService : IFileProcessingService
{
    public async Task<string> ProcessFileAsync(Stream fileStream, string fileName)
    {
        using var reader = new StreamReader(fileStream, Encoding.UTF8, leaveOpen: true);

        // Step 1: Read the header row to get the column names.
        var headerLine = await reader.ReadLineAsync();
        if (headerLine == null) throw new InvalidOperationException("File is empty.");
        var headers = headerLine.Split(',');

        // Step 2: Read a handful of sample rows for type inference.
        var sampleRows = new List<string[]>();
        for (int i = 0; i < 5 && !reader.EndOfStream; i++)
        {
            var line = await reader.ReadLineAsync();
            if (line != null) sampleRows.Add(line.Split(','));
        }

        // Step 3: Infer a column type for each header from the sampled values.
        var columnTypes = new Dictionary<string, string>();
        for (int col = 0; col < headers.Length; col++)
        {
            var values = sampleRows.Where(r => r.Length > col).Select(r => r[col]).ToList();
            columnTypes[headers[col]] = InferColumnType(values);
        }

        // Step 4: Create the SQLite schema, one .db file per uploaded CSV.
        var tableName = Path.GetFileNameWithoutExtension(fileName);
        if (!GenerateDatabaseSchema(columnTypes, tableName))
            throw new InvalidOperationException($"Failed to create the SQLite schema for '{fileName}'.");

        // Step 5: Return a human-readable summary of the detected schema.
        var result = string.Join("\n", columnTypes.Select(kv => $"{kv.Key}: {kv.Value}"));
        return $"Detected columns in '{fileName}':\n{result}";
    }

    private bool GenerateDatabaseSchema(Dictionary<string, string> columnTypes, string tableName)
    {
        try
        {
            var connectionString = $"Data Source={tableName}.db";
            using var connection = new SqliteConnection(connectionString);
            connection.Open();

            string createTableSql = GenerateCreateTableString(columnTypes, tableName);

            using var command = connection.CreateCommand();
            command.CommandText = createTableSql;
            command.ExecuteNonQuery();
            return true;
        }
        catch
        {
            return false;
        }
    }

    private string GenerateCreateTableString(Dictionary<string, string> columnTypes, string tableName)
    {
        var sb = new StringBuilder();
        sb.Append($"CREATE TABLE IF NOT EXISTS [{tableName}] (");
        foreach (var col in columnTypes)
        {
            sb.Append($"[{col.Key}] {col.Value},");
        }
        sb.Length--; // remove the trailing comma
        sb.Append(");");
        return sb.ToString();
    }

    private string InferColumnType(List<string> values)
    {
        // With no usable sample values, fall back to STRING instead of defaulting to INTEGER.
        if (values.Count == 0 || values.All(string.IsNullOrWhiteSpace)) return "STRING";

        if (values.All(v => int.TryParse(v, out _))) return "INTEGER";
        if (values.All(v => double.TryParse(v, NumberStyles.Any, CultureInfo.InvariantCulture, out _))) return "REAL";
        if (values.All(v => DateTime.TryParse(v, out _))) return "DATETIME";
        if (values.All(v => bool.TryParse(v, out _))) return "BOOLEAN";
        return "STRING";
    }
}
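To try the service on its own, outside of a web app, you can call it directly from a console program. This is only a sketch with an assumed local file path:

using CsvToSqliteMigrationUtility.Services;

// Sketch: run the service against a local CSV file.
// "books.csv" is an assumed path; point it at any CSV you have on disk.
IFileProcessingService service = new FileProcessingService();

await using var fileStream = File.OpenRead("books.csv");
var summary = await service.ProcessFileAsync(fileStream, "books.csv");

Console.WriteLine(summary);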
How It Works
- Read header row → extracts column names.
- Sample first rows → infers column types (integer, real, boolean, datetime, or string).
- Generate CREATE TABLE SQL → builds the schema dynamically in SQLite.
- Insert rows → an extension: you can loop through the rest of the file and insert the data (see the sketch after this list).
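The service above stops after creating the schema; the actual insert is left as an extension. One possible sketch of that step follows, using a hypothetical InsertRowsAsync helper with the same naive comma splitting as the rest of the service (real-world CSVs with quoted or escaped fields need a proper CSV parser). Note that the service has already consumed the sample rows, so you would either buffer them or rewind the stream before handing the reader to this method.

// Hypothetical extension method for FileProcessingService: stream the remaining
// CSV lines into the new table with a parameterized INSERT inside one transaction.
private async Task InsertRowsAsync(StreamReader reader, string tableName, string[] headers)
{
    using var connection = new SqliteConnection($"Data Source={tableName}.db");
    connection.Open();
    using var transaction = connection.BeginTransaction();

    var columnList = string.Join(", ", headers.Select(h => $"[{h}]"));
    var parameterList = string.Join(", ", headers.Select((_, i) => $"@p{i}"));

    using var command = connection.CreateCommand();
    command.Transaction = transaction;
    command.CommandText = $"INSERT INTO [{tableName}] ({columnList}) VALUES ({parameterList});";

    // One parameter per column, reused for every row.
    var parameters = headers.Select((_, i) =>
    {
        var parameter = command.CreateParameter();
        parameter.ParameterName = $"@p{i}";
        command.Parameters.Add(parameter);
        return parameter;
    }).ToArray();

    string? line;
    while ((line = await reader.ReadLineAsync()) != null)
    {
        var fields = line.Split(',');
        for (int i = 0; i < parameters.Length; i++)
        {
            parameters[i].Value = i < fields.Length ? fields[i] : (object)DBNull.Value;
        }
        command.ExecuteNonQuery();
    }

    transaction.Commit();
}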
Example CSV
Let’s say we upload a file books.csv:
Id,Title,Price,PublishedOn,IsAvailable
1,The Hobbit,12.5,1937-09-21,true
2,1984,9.99,1949-06-08,false
3,Brave New World,15.0,1932-01-01,true
The service will detect the schema as:
Id: INTEGER
Title: STRING
Price: REAL
PublishedOn: DATETIME
IsAvailable: BOOLEAN
And it will generate this SQLite table:
CREATE TABLE IF NOT EXISTS [books] (
[Id] INTEGER,
[Title] STRING,
[Price] REAL,
[PublishedOn] DATETIME,
[IsAvailable] BOOLEAN
);
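You can confirm what was actually created by inspecting the database with a quick PRAGMA query. This is a sketch, assuming books.db sits next to the executable:

using Microsoft.Data.Sqlite;

// Sketch: list the columns SQLite created for the books table.
using var connection = new SqliteConnection("Data Source=books.db");
connection.Open();

using var command = connection.CreateCommand();
command.CommandText = "PRAGMA table_info([books]);";

using var reader = command.ExecuteReader();
while (reader.Read())
{
    // Column 1 is the column name, column 2 is its declared type.
    Console.WriteLine($"{reader.GetString(1)}: {reader.GetString(2)}");
}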
Why This Matters
- Reusable – Works with any CSV file, no manual schema mapping.
- Portfolio-Ready – Demonstrates practical skills in file processing, type inference, and database handling.
- Efficient – Automates what’s normally a manual import process.
This is a real-world problem many teams face — and now you have a clean, extensible .NET solution.
Next Steps
- Extend the service to insert all rows dynamically into SQLite.
- Add Minimal API endpoints to upload CSVs directly via HTTP (a starting sketch follows after this list).
- Create a simple frontend to test uploads visually.
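As a starting point for the Minimal API idea, an upload endpoint might look like this. It is only a sketch: the route name is arbitrary, and in .NET 8 binding an IFormFile in Minimal APIs requires either an antiforgery token or explicitly disabling antiforgery for the endpoint.

using CsvToSqliteMigrationUtility.Services;

var builder = WebApplication.CreateBuilder(args);
builder.Services.AddScoped<IFileProcessingService, FileProcessingService>();

var app = builder.Build();

// Sketch: accept a CSV upload and hand it to the service.
// DisableAntiforgery() keeps the example simple; don't do this blindly in production.
app.MapPost("/upload", async (IFormFile file, IFileProcessingService fileProcessingService) =>
{
    await using var stream = file.OpenReadStream();
    var summary = await fileProcessingService.ProcessFileAsync(stream, file.FileName);
    return Results.Ok(summary);
})
.DisableAntiforgery();

app.Run();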
Key Takeaway
By combining .NET 8, Microsoft.Data.Sqlite, and a little type inference logic, you can build a generic CSV-to-database migration utility that works with any dataset. This makes your project both useful and an excellent portfolio showcase.
