-
Quick Script: Convert a CSV file to a SQLite database file
Updated on October 4, 2025 to detect INTEGER/REAL/TEXT column types.
Here’s a quick bash script to convert a .csv file into a .sqlite file so you can query your CSV data in tools like TablePlus.
The file size is only slightly larger than the original CSV, and you can share it (it’s just a file!).
Just save the script below as csv-to-sqlite somewhere in your PATH and run chmod +x csv-to-sqlite to make it executable.
```bash
#!/bin/bash
# csv-to-sqlite - Convert CSV file to SQLite database with type detection
# Usage: csv-to-sqlite [options] <csvfile>

set -e

# Default values
table_name="main"

# Function to show usage
show_usage() {
cat << EOF
Usage: $0 [OPTIONS] <csvfile>

Convert a CSV file to an SQLite database with automatic type detection.

Arguments:
  csvfile    Path to the CSV file to import

Options:
  -t, --table-name NAME    Name for the table (default: main)
  -h, --help               Show this help message

Examples:
  $0 ~/mycsvfile.csv
  $0 --table-name employees ~/data/employees.csv
  $0 -t products ~/products.csv

Output:
  Creates a SQLite database with the same name as the CSV file but with
  a .sqlite extension, in the same directory as the input file.

  Column types are automatically detected:
  - INTEGER for whole numbers
  - REAL for decimal numbers
  - TEXT for everything else
EOF
}

# Parse command line arguments
while [[ $# -gt 0 ]]; do
  case $1 in
    -t|--table-name)
      table_name="$2"
      shift 2
      ;;
    -h|--help)
      show_usage
      exit 0
      ;;
    -*)
      echo "Error: Unknown option $1" >&2
      show_usage
      exit 1
      ;;
    *)
      # This should be the CSV file
      if [[ -z "${csv_file:-}" ]]; then
        csv_file="$1"
      else
        echo "Error: Multiple CSV files specified" >&2
        show_usage
        exit 1
      fi
      shift
      ;;
  esac
done

# Check if CSV file was provided
if [[ -z "${csv_file:-}" ]]; then
  echo "Error: CSV file not specified" >&2
  show_usage
  exit 1
fi

# Check if CSV file exists
if [[ ! -f "$csv_file" ]]; then
  echo "Error: File '$csv_file' not found" >&2
  exit 1
fi

# Expand tilde in path
csv_file=$(eval echo "$csv_file")

# Generate SQLite database filename
# Remove extension and add .sqlite
base_db_file="${csv_file%.*}.sqlite"
db_file="$base_db_file"

# If file exists, add timestamp suffix
if [[ -f "$db_file" ]]; then
  timestamp=$(date +"%Y%m%d%H%M%S")
  # Extract directory, basename without extension, and add timestamp
  db_dir=$(dirname "$base_db_file")
  db_name=$(basename "$base_db_file" .sqlite)
  db_file="${db_dir}/${db_name}_${timestamp}.sqlite"
  echo "Warning: '$base_db_file' already exists, creating '$db_file' instead"
fi

# Validate table name (basic check for SQL injection prevention)
if [[ ! "$table_name" =~ ^[a-zA-Z_][a-zA-Z0-9_]*$ ]]; then
  echo "Error: Invalid table name '$table_name'. Table names must start with a letter or underscore and contain only letters, numbers, and underscores." >&2
  exit 1
fi

echo "Converting '$csv_file' to SQLite database '$db_file'"
echo "Table name: $table_name"

# Detect column types using awk
# This analyzes the first 1000 rows to determine INTEGER, REAL, or TEXT types
create_table_sql=$(awk -F',' '
BEGIN {
  SAMPLE_SIZE = 1000
}

# Process header row
NR == 1 {
  for (i = 1; i <= NF; i++) {
    # Remove quotes and whitespace from header
    gsub(/^[ \t"]+|[ \t"]+$/, "", $i)
    # Use column number if header is empty
    if ($i == "" || length($i) == 0) {
      headers[i] = "col" i
    } else {
      headers[i] = $i
    }
    # Initialize all columns as INTEGER
    types[i] = "INTEGER"
  }
  next
}

# Process data rows (sample only)
NR <= SAMPLE_SIZE + 1 {
  for (i = 1; i <= NF; i++) {
    value = $i
    # Remove quotes and whitespace
    gsub(/^[ \t"]+|[ \t"]+$/, "", value)

    # Skip empty values
    if (value == "") continue

    # If already TEXT, skip further checks
    if (types[i] == "TEXT") continue

    # Check if value is an integer
    if (types[i] == "INTEGER") {
      if (value !~ /^-?[0-9]+$/) {
        # Not an integer; check for a real number
        if (value ~ /^-?[0-9]*\.?[0-9]+([eE][-+]?[0-9]+)?$/) {
          types[i] = "REAL"
        } else {
          types[i] = "TEXT"
        }
      }
    }
    # Check if value is a real number
    else if (types[i] == "REAL") {
      if (value !~ /^-?[0-9]*\.?[0-9]+([eE][-+]?[0-9]+)?$/) {
        types[i] = "TEXT"
      }
    }
  }
}

END {
  # Generate CREATE TABLE statement
  print "CREATE TABLE IF NOT EXISTS temp_table ("
  for (i = 1; i <= length(headers); i++) {
    # Escape double quotes in column names
    safe_name = headers[i]
    gsub(/"/, "\"\"", safe_name)
    printf "  \"%s\" %s", safe_name, types[i]
    if (i < length(headers)) {
      print ","
    } else {
      print ""
    }
  }
  print ");"
}' "$csv_file")

# Import CSV to SQLite with proper schema
sqlite3 "$db_file" << EOF
-- Create temporary table with detected types
$create_table_sql

-- Import CSV data (--skip 1 leaves out the header row, since the
-- table already exists; requires sqlite3 3.32 or newer)
.mode csv
.import --skip 1 "$csv_file" temp_table

-- Create final table with correct name
CREATE TABLE "$table_name" AS SELECT * FROM temp_table;

-- Drop temporary table
DROP TABLE temp_table;
EOF

echo "Successfully created SQLite database: $db_file"
echo "Table '$table_name' contains $(sqlite3 "$db_file" "SELECT COUNT(*) FROM $table_name;") rows"
echo ""

# Show schema
echo "Table schema:"
sqlite3 "$db_file" "PRAGMA table_info($table_name);" | awk -F'|' '{printf "  %-20s %s\n", $2, $3}'
echo ""
echo "You can now open this SQLite database in applications like TablePlus for convenient data viewing and manipulation."
```

As a side note, you can use something like dsq to query a CSV file directly. I've stopped using it for this particular task.
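The heart of the script is that awk pass: every column starts as INTEGER and gets demoted to REAL or TEXT as disqualifying values appear. Here is the same logic in isolation, fed a tiny inline CSV I made up for illustration:

```shell
printf 'id,price,name\n1,1.50,apple\n2,3,banana\n' | awk -F',' '
# Header row: start every column as INTEGER.
NR == 1 { for (i = 1; i <= NF; i++) { headers[i] = $i; types[i] = "INTEGER" }; next }
# Data rows: demote a column when a value disqualifies the current type.
{
  for (i = 1; i <= NF; i++) {
    v = $i
    if (v == "" || types[i] == "TEXT") continue
    if (v !~ /^-?[0-9]+$/) {
      if (v ~ /^-?[0-9]*\.?[0-9]+([eE][-+]?[0-9]+)?$/) {
        if (types[i] == "INTEGER") types[i] = "REAL"
      } else {
        types[i] = "TEXT"
      }
    }
  }
}
END { for (i = 1; i <= NF; i++) printf "%s %s\n", headers[i], types[i] }'
# prints:
# id INTEGER
# price REAL
# name TEXT
```

Note that a plain -F',' split (as in the script) treats commas inside quoted fields as delimiters, so heavily quoted CSVs can misalign columns and demote them to TEXT unexpectedly.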
-
Grouping Focus Bracketed Photos from an OM-1 Mark 2
The Problem
I have a folder of image (JPG and RAW) files of some focus bracketed shots from my OM-1 Mark 2. I need to find a way to group them… maybe there is exif data that’d help? Or maybe I can use bash to group by timestamp to find all shots within 1 second of the previous shot… if the gap is more than 1 second then it starts a new “series”.
That was my prompt for Claude.
Claude suggested I try this first, to see what’s available:
```bash
# Install exiftool if you don't have it
# brew install exiftool                    # macOS
# sudo apt install libimage-exiftool-perl  # Ubuntu/Debian

# Check what EXIF data is available
exiftool P7120919.JPG | grep -i bracket
exiftool P7120919.JPG | grep -i focus
exiftool P7120919.JPG | grep -i sequence
```

I ran the commands and replied:
Ooh good news. There appears to be some metadata in both JPG and ORF files.
```text
Focus Bracket Step Size : 5
White Balance Bracket : 0 0
Drive Mode : Focus Bracketing, Shot 7; Electronic shutter
```

Claude then gave me a bash script. I made a few tweaks, re-prompted, made a few more tweaks, and then it was done.
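The Drive Mode value is what the script keys on. Here's a quick standalone check of the bash pattern match, with the value hardcoded from the output above:

```shell
#!/bin/bash
# Extract the shot number from an exiftool Drive Mode value.
drive_mode='Focus Bracketing, Shot 7; Electronic shutter'
if [[ "$drive_mode" =~ "Focus Bracketing, Shot "([0-9]+) ]]; then
  echo "shot number: ${BASH_REMATCH[1]}"
fi
# prints: shot number: 7
```

If the match fails on your files, the exact wording of the Drive Mode field may differ by camera model or firmware.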
It turns this flat folder of files:
It turns this flat folder of files:

```text
P7120857.JPG
P7120857.ORF
P7120858.JPG
P7120858.ORF
P7120859.JPG
P7120859.ORF
P7120860.JPG
P7120860.ORF
P7120861.JPG
P7120861.ORF
P7120863.JPG
P7120863.ORF
P7120864.JPG
P7120864.ORF
P7120865.JPG
P7120865.ORF
```

Into this:
```text
├── focus_bracket_series_1
│   ├── P7120857.JPG
│   ├── P7120857.ORF
│   ├── P7120858.JPG
│   ├── P7120858.ORF
│   ├── P7120859.JPG
│   ├── P7120859.ORF
│   ├── P7120860.JPG
│   ├── P7120860.ORF
│   ├── P7120861.JPG
│   └── P7120861.ORF
├── focus_bracket_series_2
│   ├── P7120863.JPG
│   ├── P7120863.ORF
│   ├── P7120864.JPG
│   ├── P7120864.ORF
│   ├── P7120865.JPG
│   └── P7120865.ORF
```

Implementing
- Write this script somewhere (e.g., ~/scripts/om1-focus-bracket-grouper.sh)
- Make it executable (chmod +x om1-focus-bracket-grouper.sh)
- Go into the flat folder (e.g., cd ~/Pictures/MacroShots)
- Run it (~/scripts/om1-focus-bracket-grouper.sh)
- It will show you what the results will be, then prompt you to hit Y to move the files or N to abort.
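Before the full listing, the core grouping rule is worth seeing in isolation: a new series starts whenever the shot counter goes backwards. A toy sketch (shot numbers invented, no EXIF involved):

```shell
#!/bin/bash
# Sketch: a new series begins whenever the shot number resets.
# The real script below reads these numbers from EXIF Drive Mode.
series=0
prev=999
for shot in 1 2 3 4 5 1 2 3; do
  if [ "$shot" -lt "$prev" ]; then
    series=$((series+1))
    echo "shot $shot starts focus_bracket_series_$series"
  fi
  prev=$shot
done
# prints:
# shot 1 starts focus_bracket_series_1
# shot 1 starts focus_bracket_series_2
```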
```bash
#!/bin/bash

# Analyze focus bracket sequences (dry run)
analyze_focus_brackets() {
  echo "=== Focus Bracket Analysis ==="
  echo

  local series_num=1
  local prev_shot_num=999
  local series_files=()

  # Sort by filename (chronological order) instead of modification time
  for file in $(ls P*.ORF P*.JPG 2>/dev/null | sort); do
    [[ -f "$file" ]] || continue

    # Get both drive mode and timestamp for verification
    drive_mode=$(exiftool -DriveMode -T "$file" 2>/dev/null)
    timestamp=$(exiftool -DateTimeOriginal -d "%Y-%m-%d %H:%M:%S" -T "$file" 2>/dev/null)

    echo "Processing: $file - Drive Mode: '$drive_mode'"

    if [[ "$drive_mode" =~ "Focus Bracketing, Shot "([0-9]+) ]]; then
      shot_num="${BASH_REMATCH[1]}"
      echo "  Found shot number: $shot_num"

      # If shot number reset (went backwards), we found a new series
      if [[ $shot_num -lt $prev_shot_num ]]; then
        if [[ ${#series_files[@]} -gt 0 ]]; then
          echo "Series $((series_num-1)): ${#series_files[@]} shots"
          printf "  %s\n" "${series_files[@]}"
          echo
        fi
        series_files=()
        echo "--- Starting Series $series_num ---"
        ((series_num++))
      fi

      series_files+=("$file (Shot $shot_num, $timestamp)")
      prev_shot_num=$shot_num
    else
      echo "  Not a focus bracket shot, skipping"
    fi
  done

  # Print final series
  if [[ ${#series_files[@]} -gt 0 ]]; then
    echo "Series $((series_num-1)): ${#series_files[@]} shots"
    printf "  %s\n" "${series_files[@]}"
  fi

  echo
  echo "Total series found: $((series_num-1))"
}

# Function to actually move files after confirmation
move_focus_brackets() {
  echo "Moving files to series folders..."

  local series_num=1
  local prev_shot_num=999

  for file in $(ls P*.ORF P*.JPG 2>/dev/null | sort); do
    [[ -f "$file" ]] || continue

    drive_mode=$(exiftool -DriveMode -T "$file" 2>/dev/null)

    if [[ "$drive_mode" =~ "Focus Bracketing, Shot "([0-9]+) ]]; then
      shot_num="${BASH_REMATCH[1]}"

      if [[ $shot_num -lt $prev_shot_num ]]; then
        ((series_num++))
      fi

      series_dir="focus_bracket_series_$((series_num-1))"
      mkdir -p "$series_dir"
      echo "Moving $file to $series_dir"
      mv "$file" "$series_dir/"

      prev_shot_num=$shot_num
    fi
  done
}

# Run analysis first
analyze_focus_brackets
echo
read -p "Proceed with moving files? (y/N): " -n 1 -r
echo

if [[ $REPLY =~ ^[Yy]$ ]]; then
  move_focus_brackets
  echo "Done!"
else
  echo "Aborted. Files not moved."
fi
```

-
Using a Game Controller to Speed Up Tedious Mac Tasks
FULL DISCLOSURE: I personally prefer using Keyboard Maestro over Hammerspoon for handling the “actions” in this tutorial. That said, the Hammerspoon approach seemed worth mentioning, even if I don’t have much confidence in my Lua code — so here it is, just in case it’s useful!
If you spend a lot of time performing repetitive tasks on your Mac, using a game controller as an input device can make things much more efficient—and more comfortable. Whether you’re organizing files, editing video, or culling photos, a controller lets you execute common actions with minimal effort.
In this post, I’ll walk through how I use a game controller to speed up photo culling—the process of selecting the best images from a shoot. While this tutorial focuses on photo review, the same approach can be applied to many other workflows.
Culling is an essential but tedious part of a photographer’s workflow. I’ve spent countless hours sorting through tens of thousands of images in ExcireFoto 2025 and Adobe Lightroom, and I quickly realized that using just a keyboard and trackpad was slowing me down. That’s when I started experimenting with using a USB/Bluetooth game controller for culling. Instead of hunching over a keyboard, I can sit back and control everything with a lightweight, ergonomic device in my hands.
Here’s why it works so well:
- Ergonomic – Game controllers are made to fit your hands. It's natural to hold them for hours at a time.
- Quicker actions with muscle memory – Pressing controller buttons becomes second nature, making repetitive tasks much faster.
- Keeps your eyes on the screen – No need to glance at your keyboard; your fingers always know where to go.
- Better posture and comfort – You can position your monitor optimally without worrying about your keyboard.
- Simple to program – Setting up button mappings takes only minutes.
- Free software – You don't need expensive tools to get this working.
- Gamifies tedious work – Making a dull task feel more interactive can increase efficiency.
- More control over navigation – Assign multiple actions to different buttons or combinations.
- Works for more than just culling – This method applies to video editing, file organization, and more.
I’ve done various iterations of this over time. Today I realized I’ve never blogged about it. So, here goes!
-
Double-tap Modifier Hotkeys in Any Application
PHPStorm has this great feature where you can double-tap the Shift key to open the “Search Everywhere” utility (or at least that's how I've configured my PHPStorm). It's wonderful and I use it all day. I wanted the same behavior in some of my other applications but quickly realized it isn't achievable out of the box, even with Keyboard Maestro. I tried using Keyboard Maestro's double-tap USB Device Key trigger, but it didn't work. I ended up solving the problem with Karabiner Elements.
The idea is simple: use Karabiner Elements to turn a double-tapped left Shift into a hotkey I'd never realistically assign in any application, then use that hotkey as the trigger in Keyboard Maestro. I chose <cmd-shift-opt-ctrl-f>.
First, here is the code, which you can create as a Complex Modification in Karabiner Elements:
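The original JSON didn't survive into this copy of the post, but the standard Karabiner double-tap pattern looks roughly like this. This is a sketch, not the original code: the variable name and the default delayed-action window are my assumptions.

```json
{
  "title": "Double-tap left shift to cmd+shift+opt+ctrl+f",
  "rules": [
    {
      "description": "Double-tap left_shift sends cmd+shift+opt+ctrl+f",
      "manipulators": [
        {
          "type": "basic",
          "conditions": [
            { "type": "variable_if", "name": "left_shift_tapped", "value": 1 }
          ],
          "from": { "key_code": "left_shift" },
          "to": [
            {
              "key_code": "f",
              "modifiers": ["left_command", "left_shift", "left_option", "left_control"]
            }
          ]
        },
        {
          "type": "basic",
          "from": { "key_code": "left_shift" },
          "to": [
            { "set_variable": { "name": "left_shift_tapped", "value": 1 } },
            { "key_code": "left_shift" }
          ],
          "to_delayed_action": {
            "to_if_invoked": [
              { "set_variable": { "name": "left_shift_tapped", "value": 0 } }
            ],
            "to_if_canceled": [
              { "set_variable": { "name": "left_shift_tapped", "value": 0 } }
            ]
          }
        }
      ]
    }
  ]
}
```

Save it under ~/.config/karabiner/assets/complex_modifications/ (the usual location) and enable it from the Karabiner Elements UI. The second manipulator re-emits left_shift, so normal Shift use is unaffected; the first manipulator only fires when the variable marks a recent tap.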
-
Laravel + Filament + Private Disk on Cloudways
Introduction
After a long, frustrating debugging exercise, I have discovered a quick tip for my Cloudways + Laravel friends. The “why” is still a touch fuzzy for me at the moment, but I’ll at least explain the symptoms and the fix.
The goal: Add a private file upload to a Filament resource form, ensuring anonymous users cannot download/view the file.
The issue: Everything was working great in my ddev environment. I could preview and open the files while logged in, and I'd see a 403 when logged out. When I moved this to a Cloudways server, I was automatically logged out every time I hit a private file URL. Weird, right?
Code
Here’s the final code (the Cloudways “fix” is further down this page):
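The code block itself didn't survive into this copy of the post, so here is only a rough sketch of the shape such an upload takes in a Filament form. The field name, disk name, and directory are placeholders I chose, not the original code:

```php
use Filament\Forms;
use Illuminate\Support\Facades\Route;
use Illuminate\Support\Facades\Storage;

// In the Filament resource's form schema: store uploads on a
// non-public disk so the web server can't serve them directly.
Forms\Components\FileUpload::make('attachment')
    ->disk('private')            // a disk defined in config/filesystems.php
    ->directory('attachments')
    ->visibility('private');

// Downloads then go through an authenticated route that streams the
// file, for example:
Route::get('/private/{path}', fn (string $path) =>
    Storage::disk('private')->download($path)
)->middleware('auth')->where('path', '.*');
```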
-
Writing Tests for Drush Commands
There are plenty of examples of these in the wild, but I figured I'd show a stripped-down version of an automated Kernel test that successfully tests a Drush command. The trick is making sure you set up a logger and that you stub a few global functions (if you happen to use $this->logger() and dt() in your Drush commands). Also featured in this example is the use of Faker to generate realistic test data.
```php
<?php

declare(strict_types = 1);

namespace Drupal\mymodule\Tests\Kernel {

  use Drupal\KernelTests\KernelTestBase;
  use Drupal\mymodule\Drush\Commands\QueueCommands;
  use Drush\Log\DrushLoggerManager;
  use Faker\Factory;
  use Faker\Generator;
  use Psr\Log\LoggerInterface;

  /**
   * Tests the QueueCommands Drush command file.
   */
  final class QueueTest extends KernelTestBase {

    /**
     * {@inheritdoc}
     */
    protected static $modules = [
      // Enable all the modules that this test relies on.
      'auto_entitylabel',
      'block',
      'datetime',
      'eck',
      'field',
      'file',
      'image',
      'link',
      'mymodule',
      'mysql',
      'options',
      'path',
      'path_alias',
      'system',
      'text',
      'user',
      'views',
      'views_bulk_operations',
    ];

    /**
     * The Faker generator.
     *
     * @var \Faker\Generator
     */
    protected Generator $faker;

    /**
     * The logger.
     *
     * @var \Drush\Log\DrushLoggerManager|\Psr\Log\LoggerInterface
     */
    protected DrushLoggerManager|LoggerInterface $logger;

    /**
     * The Drush queue commands.
     *
     * @var \Drupal\mymodule\Drush\Commands\QueueCommands
     */
    protected QueueCommands $drushQueueCommands;

    /**
     * Set up the test environment.
     */
    protected function setUp(): void {
      parent::setUp();
      // Install required config.
      $this->installConfig(['mymodule']);
      $this->faker = Factory::create();
      $logger_class = class_exists(DrushLoggerManager::class) ? DrushLoggerManager::class : LoggerInterface::class;
      $this->logger = $this->prophesize($logger_class)->reveal();
      $this->drushQueueCommands = new QueueCommands();
      $this->drushQueueCommands->setLogger($this->logger);
    }

    /**
     * Tests the mymodule:close-all-registration-entities drush command.
     */
    public function testCloseAllRegistrationEntities() {
      // Create a single registration entity.
      // Just demoing simple Faker usage for blog post.
      $registration = \Drupal::entityTypeManager()->getStorage('registration')->create([
        'type' => 'appt',
        'field_echeckin_link' => $this->faker->url(),
        'field_epic_id' => $this->faker->randomNumber(9),
        'field_first_name' => $this->faker->firstName(),
        'field_has_echeckin_link' => $this->faker->boolean(),
        'field_outcome' => NULL,
        'field_phone_number' => $this->faker->phoneNumber(),
        'field_sequence_is_open' => 1,
      ]);
      $registration->save();

      // This method exists in my module's Drush command file.
      // It sets field_sequence_is_open to 0 for all open reg entities.
      // drush mymodule:close-all-registration-entities.
      $this->drushQueueCommands->closeAllRegistrationEntities();

      // Assert that the registration entity was updated.
      $this->assertCount(1, \Drupal::entityTypeManager()->getStorage('registration')->loadByProperties(['field_sequence_is_open' => 0]));
    }

  }
}

namespace {

  if (!function_exists('dt')) {

    /**
     * Stub for dt().
     *
     * @param string $message
     *   The text.
     * @param array $replace
     *   The replacement values.
     *
     * @return string
     *   The text.
     */
    function dt($message, array $replace = []): string {
      return strtr($message, $replace);
    }

  }

  if (!function_exists('drush_op')) {

    /**
     * Stub for drush_op().
     *
     * @param callable $callable
     *   The function to call.
     */
    function drush_op(callable $callable) {
      $args = func_get_args();
      array_shift($args);
      return call_user_func_array($callable, $args);
    }

  }

}
```

I learned this via the migrate_tools project here.
-
Using Data Providers in PHPUnit Tests
This is the “before” code, where a single test is run and each scenario I'm testing could be influenced by the previous scenario (not a good thing, unless that was my goal, which it was not).
```php
/**
 * Tests the getRegistrationsWithinNumDays function.
 */
public function testGetRegistrationsWithinNumDays() {
  $firstRegDateTimeStr = '2024-03-18 08:00:00';
  $this->mhpreregTestHelpers->createRegistration(['field_reg_datetime' => '2024-03-18 09:00:00']);
  $this->mhpreregTestHelpers->createRegistration(['field_reg_datetime' => '2024-03-19 13:00:00']);
  $this->mhpreregTestHelpers->createRegistration(['field_reg_datetime' => '2024-03-20 14:20:00']);
  $this->mhpreregTestHelpers->createRegistration(['field_reg_datetime' => '2024-03-20 17:35:00']);
  $this->mhpreregTestHelpers->createRegistration(['field_reg_datetime' => '2024-03-21 17:35:00']);
  $this->mhpreregTestHelpers->createRegistration(['field_reg_datetime' => '2024-03-22 08:00:00']);

  $registrations = Utils::getIdsOfRegistrationsWithinNumDays(1, $firstRegDateTimeStr);
  $this->assertEquals(2, count($registrations));
  $registrations = Utils::getIdsOfRegistrationsWithinNumDays(2, $firstRegDateTimeStr);
  $this->assertEquals(3, count($registrations));
  $registrations = Utils::getIdsOfRegistrationsWithinNumDays(3, $firstRegDateTimeStr);
  $this->assertEquals(5, count($registrations));
}
```

This is the “after” code, where three tests are run. Unfortunately, this takes 3x longer to execute. The upside is that each scenario cannot affect the others, and it's perhaps more readable and easier to add additional scenarios.
```php
/**
 * Data provider for testGetRegistrationsWithinNumDays.
 */
public function registrationsDataProvider() {
  return [
    // Label => [numDays, expectedCount].
    '1 day' => [1, 2],
    '2 days' => [2, 3],
    '3 days' => [3, 5],
  ];
}

/**
 * Tests the getRegistrationsWithinNumDays function using a data provider.
 *
 * @dataProvider registrationsDataProvider
 */
public function testGetRegistrationsWithinNumDays($numDays, $expectedCount) {
  $this->mhpreregTestHelpers->createRegistration(['field_reg_datetime' => '2024-03-18 08:00:00']);
  $this->mhpreregTestHelpers->createRegistration(['field_reg_datetime' => '2024-03-19 09:00:00']);
  $this->mhpreregTestHelpers->createRegistration(['field_reg_datetime' => '2024-03-20 13:00:00']);
  $this->mhpreregTestHelpers->createRegistration(['field_reg_datetime' => '2024-03-21 14:20:00']);
  $this->mhpreregTestHelpers->createRegistration(['field_reg_datetime' => '2024-03-21 17:35:00']);
  $this->mhpreregTestHelpers->createRegistration(['field_reg_datetime' => '2024-03-22 17:35:00']);

  $registrations = Utils::getIdsOfRegistrationsWithinNumDays($numDays, '2024-03-18 08:00:00');
  $this->assertEquals($expectedCount, count($registrations), "Failed for {$numDays} days.");
}
```

-
Skip Empty Values During Concatenation in a Drupal 10 Migration
UPDATE. You can do the same as below without the need for a custom module function. (source)
```yaml
field_address_full:
  - plugin: callback
    callable: array_filter
    source:
      - address_1
      - city
      - state
  - plugin: concat
    delimiter: ', '
```
This quick example shows one method to ignore empty values in a Drupal 10 migration concatenate process.
Code
Migrate code:
```yaml
process:
  temp_address_parts:
    -
      plugin: skip_on_empty
      method: process
      source:
        - address_1
        - city
        - state
    -
      plugin: callback
      callable: mymodule_migrate_filter_empty
  field_address_full:
    plugin: concat
    delimiter: ', '
    source: '@temp_address_parts'
```

Module code:
```php
/**
 * Filters out empty values from an array.
 */
function mymodule_migrate_filter_empty($values): array {
  return array_filter($values, function ($value) {
    return !empty($value);
  });
}
```

Result
Example data:
```text
address_1,city,state
1 Main St.,Portland,ME
2 Main St.,Portland
,Portland,ME
```

Resulting field_address_full plain text value:
```text
1 Main St., Portland, ME
2 Main St., Portland
Portland, ME
```

-
Using lnav to View Live DDEV Logs
https://lnav.org/ is a fantastic tool to work with log files. The SQL query functionality is a lot like https://github.com/multiprocessio/dsq, which also impresses.
My quick tip is that lnav can read from stdin, which means you can pipe content into it. Here's how I monitor my ddev environment in real time. Hooray for colors!

```bash
ddev logs -f | lnav -q
```

If you haven't explored lnav, you'll want to dive into the documentation a bit. It does a lot more than add pretty colors to your log output. https://docs.lnav.org/en/v0.11.2/hotkeys.html is certainly worth bookmarking.
-
Rendering null boolean values with Grid4PHP
I’m a big fan of Grid4PHP for adding a quick CRUD user interface to a MySQL database.
I have a lot of boolean columns, like “is_telehealth”, that need to start as NULL until we make a determination about the field's value. Grid4PHP renders null booleans as “0” in the table/edit form. This is misleading, given it's not actually a 0 value. Adding the ‘isnull’ attribute to the column lets you save the null value, but it still renders as a “0”.
Using a custom formatter I was able to render an empty string (nothing) instead of “0” for these null booleans:
```php
$boolNullFormatter = "function(cellval,options,rowdata){ return cellval === null ? '' : cellval; }";

$cols[] = ['name' => 'is_in_gmb', 'isnull' => true, 'formatter' => $boolNullFormatter];
$cols[] = ['name' => 'is_property', 'isnull' => true, 'formatter' => $boolNullFormatter];
$cols[] = ['name' => 'is_telehealth', 'isnull' => true, 'formatter' => $boolNullFormatter];
```