
Integrating Google CLI + LLMs: Automating My Alaskan Homesteading Logs

Most people hear "homesteading in Alaska" and picture some guy with an axe and a flannel shirt who has never touched a computer. And honestly, some days that's me. But the other half of my life involves writing code, running cloud infrastructure, and building AI-powered systems for a living. The collision of those two worlds has produced something I never expected: an automated homesteading log system powered by Google Cloud CLI tools and large language models.

It started because I was tired of losing track of things. When did I last service the generator? How much propane did I burn through last February? What was the weather pattern the week before the pipes almost froze? I had notebooks. I had sticky notes on the fridge. I had a spreadsheet I updated for about three weeks before abandoning it entirely.

So I did what any engineer living in a cabin in Caswell Lakes would do: I over-engineered it.


The Problem With Homesteading Record-Keeping

If you've never managed a remote property, let me paint the picture. At any given time, I'm tracking:

  • Weather conditions — temperature highs/lows, wind, snowfall, freeze/thaw cycles
  • Generator and power systems — run hours, fuel consumption, maintenance intervals
  • Propane and heating — tank levels, delivery schedules, burn rate by month
  • Water system — well pump status, filter changes, freeze prevention measures
  • Structural maintenance — roof snow load, deck repairs, insulation checks
  • Supply inventory — food stores, emergency supplies, hardware stock

That's a lot of state to manage for one guy and a cabin. And unlike a software system, there's no monitoring dashboard. There's no Datadog for "how much firewood is stacked by the shed." At least, there wasn't.


Why Google Cloud CLI + LLMs

I chose Google Cloud for a few practical reasons. First, I already had a GCP project running for some side work. Second, Google's CLI tooling (gcloud) is genuinely excellent for scripting. Third, BigQuery gives me a free tier that's more than sufficient for storing homesteading logs — we're talking kilobytes of data, not terabytes.

The LLM piece came in because I wanted natural language input. I didn't want to fill out forms or remember column names. I wanted to say "changed the generator oil today, about 150 hours on the meter, used Rotella T6 5W-40" and have the system figure out what to do with that.

Here's the architecture, if you can call it that:

  1. A simple Node.js CLI script that accepts natural language input
  2. An LLM call to parse and classify the input into structured data
  3. Google Cloud CLI commands to push the data into BigQuery
  4. A separate script to query and summarize logs on demand

Nothing fancy. No Kubernetes. No microservices. Just scripts that do useful things.


Setting Up the Foundation

First, you need a GCP project with BigQuery enabled and the gcloud CLI installed and authenticated. If you haven't done this before:

gcloud auth login
gcloud config set project your-project-id

Then create a dataset and tables. I have a few core tables:

bq mk --dataset homestead_logs

bq mk --table homestead_logs.weather_log \
  date:DATE,high_temp:FLOAT,low_temp:FLOAT,conditions:STRING,snowfall_inches:FLOAT,wind_mph:FLOAT,notes:STRING

bq mk --table homestead_logs.maintenance_log \
  date:DATE,system:STRING,action:STRING,parts_used:STRING,hours_reading:FLOAT,cost:FLOAT,notes:STRING

bq mk --table homestead_logs.supply_log \
  date:DATE,category:STRING,item:STRING,quantity:FLOAT,unit:STRING,action:STRING,notes:STRING

Simple schemas. No foreign keys. No normalization zealotry. This is a cabin log, not a banking system.
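One guard worth keeping client-side: LLM parsers occasionally invent field names that don't exist in the target table, so it pays to filter parsed output against these schemas before building any SQL. A minimal sketch (the SCHEMAS map mirrors the bq mk commands above; the helper name is illustrative, not part of the original scripts):

```javascript
// Mirrors the bq mk schemas above; drops any fields the LLM
// hallucinates that don't exist in the target table.
var SCHEMAS = {
  weather: ["date", "high_temp", "low_temp", "conditions", "snowfall_inches", "wind_mph", "notes"],
  maintenance: ["date", "system", "action", "parts_used", "hours_reading", "cost", "notes"],
  supply: ["date", "category", "item", "quantity", "unit", "action", "notes"]
};

function filterToSchema(type, fields) {
  var allowed = SCHEMAS[type];
  if (!allowed) throw new Error("Unknown log type: " + type);
  var clean = {};
  Object.keys(fields).forEach(function(key) {
    if (allowed.indexOf(key) !== -1) clean[key] = fields[key];
  });
  return clean;
}
```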


The Natural Language Input Parser

This is where the LLM earns its keep. I wrote a Node.js script that takes a plain-English description of whatever I did today and converts it into structured records.

var https = require("https");
var { execSync } = require("child_process");

var ANTHROPIC_API_KEY = process.env.ANTHROPIC_API_KEY;

function callClaude(prompt) {
  return new Promise(function(resolve, reject) {
    var body = JSON.stringify({
      model: "claude-haiku-4-20250514",
      max_tokens: 1024,
      messages: [{ role: "user", content: prompt }]
    });

    var options = {
      hostname: "api.anthropic.com",
      path: "/v1/messages",
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        "x-api-key": ANTHROPIC_API_KEY,
        "anthropic-version": "2023-06-01"
      }
    };

    var req = https.request(options, function(res) {
      var data = "";
      res.on("data", function(chunk) { data += chunk; });
      res.on("end", function() {
        try {
          var parsed = JSON.parse(data);
          // The API returns an error object instead of content on failure
          if (parsed.error) {
            reject(new Error("API error: " + parsed.error.message));
            return;
          }
          resolve(parsed.content[0].text);
        } catch (err) {
          reject(err);
        }
      });
    });

    req.on("error", reject);
    req.write(body);
    req.end();
  });
}

function parseLogEntry(naturalText) {
  var prompt = "You are a homesteading log parser. Given the following natural language " +
    "log entry, extract structured data and return ONLY valid JSON with these fields:\n" +
    "- type: one of 'weather', 'maintenance', 'supply'\n" +
    "- date: ISO date string (use today if not specified: " + new Date().toISOString().split("T")[0] + ")\n" +
    "- fields: object with relevant fields based on type\n\n" +
    "For weather: high_temp, low_temp, conditions, snowfall_inches, wind_mph, notes\n" +
    "For maintenance: system, action, parts_used, hours_reading, cost, notes\n" +
    "For supply: category, item, quantity, unit, action (received/used/ordered), notes\n\n" +
    "Log entry: " + naturalText;

  return callClaude(prompt).then(function(response) {
    return JSON.parse(response);
  });
}

The key insight here is that I'm not trying to build a perfect NLP system. I'm using the LLM as a fuzzy parser. If I say "generator oil change, 150 hours, used Rotella," it figures out that's a maintenance entry for the generator system, the action is "oil change," and Rotella is the parts used. It gets it right about 95% of the time, which is way better than my sticky note system.
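One failure mode worth guarding against in that JSON.parse step: models sometimes wrap JSON in markdown fences or preface it with a sentence of prose. A defensive extraction sketch (illustrative, not part of the original script):

```javascript
// LLMs occasionally wrap JSON in ```json fences or add leading
// prose; strip fences and pull out the first {...} block.
function extractJson(text) {
  var cleaned = text.replace(/```(?:json)?/g, "").trim();
  var start = cleaned.indexOf("{");
  var end = cleaned.lastIndexOf("}");
  if (start === -1 || end === -1) {
    throw new Error("No JSON object found in response");
  }
  return JSON.parse(cleaned.slice(start, end + 1));
}
```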


Pushing Data to BigQuery via CLI

Once I have structured data, I use the bq command-line tool to insert it. Here's the insertion function:

function insertRecord(parsed) {
  var table = "homestead_logs." + parsed.type + "_log";
  var fields = parsed.fields;
  fields.date = parsed.date;

  var columns = Object.keys(fields);
  var values = columns.map(function(key) {
    var val = fields[key];
    if (typeof val === "string") {
      return "'" + val.replace(/'/g, "\\'") + "'";
    }
    if (val === null || val === undefined) {
      return "NULL";
    }
    return val;
  });

  var query = "INSERT INTO " + table + " (" + columns.join(", ") + ") " +
    "VALUES (" + values.join(", ") + ")";

  try {
    // Pass the SQL on stdin so quotes in notes or parts_used
    // can't break the shell command
    execSync("bq query --use_legacy_sql=false", {
      input: query,
      stdio: "pipe"
    });
    console.log("Record inserted into " + table);
    return true;
  } catch (err) {
    console.error("Insert failed:", err.message);
    return false;
  }
}

I know what you're thinking. Yes, I'm shelling out to bq instead of using the BigQuery Node.js client library. Here's my defense: the bq CLI is already authenticated via gcloud, I don't need another dependency, and this script runs on my laptop maybe twice a day. Performance is irrelevant. Simplicity wins.


The Daily Log Script

The main entry point ties it all together. I run this from the terminal — usually while I'm drinking coffee in the morning or wrapping up evening chores.

var readline = require("readline");

var rl = readline.createInterface({
  input: process.stdin,
  output: process.stdout
});

function promptAndLog() {
  rl.question("Log entry (or 'done' to quit): ", function(input) {
    if (input.toLowerCase() === "done") {
      console.log("Logs saved. Stay warm out there.");
      rl.close();
      return;
    }

    parseLogEntry(input).then(function(parsed) {
      console.log("Parsed as:", JSON.stringify(parsed, null, 2));
      rl.question("Save this? (y/n): ", function(confirm) {
        if (confirm.toLowerCase() === "y") {
          insertRecord(parsed);
        } else {
          console.log("Skipped.");
        }
        promptAndLog();
      });
    }).catch(function(err) {
      console.error("Parse error:", err.message);
      promptAndLog();
    });
  });
}

console.log("=== Homestead Log ===");
console.log("Enter observations in plain English. Type 'done' to finish.\n");
promptAndLog();

A typical session looks like this:

=== Homestead Log ===
Enter observations in plain English. Type 'done' to finish.

Log entry (or 'done' to quit): High of 12, low of -8, light snow all day maybe 3 inches, calm wind
Parsed as: {
  "type": "weather",
  "date": "2026-02-15",
  "fields": {
    "high_temp": 12,
    "low_temp": -8,
    "conditions": "light snow",
    "snowfall_inches": 3,
    "wind_mph": 0,
    "notes": "Snow throughout the day, calm conditions"
  }
}
Save this? (y/n): y
Record inserted into homestead_logs.weather_log

Log entry (or 'done' to quit): Swapped generator air filter, 380 hours on the clock
Parsed as: {
  "type": "maintenance",
  "date": "2026-02-15",
  "fields": {
    "system": "generator",
    "action": "air filter replacement",
    "parts_used": "air filter",
    "hours_reading": 380,
    "cost": null,
    "notes": null
  }
}
Save this? (y/n): y
Record inserted into homestead_logs.maintenance_log

It's not glamorous, but it works. Every single time.


Automated Weather Logging

I got tired of manually entering weather data, so I added a cron job that pulls weather from the National Weather Service API and logs it automatically.

var https = require("https");
var { execSync } = require("child_process");

// NWS API for Caswell Lakes area
var STATION_URL = "https://api.weather.gov/stations/PATK/observations/latest";

function fetchWeather() {
  return new Promise(function(resolve, reject) {
    https.get(STATION_URL, { headers: { "User-Agent": "homestead-log" } },
      function(res) {
        var data = "";
        res.on("data", function(chunk) { data += chunk; });
        res.on("end", function() {
          var obs = JSON.parse(data).properties;
          resolve({
            temp_c: obs.temperature.value,
            wind_ms: obs.windSpeed.value,
            conditions: obs.textDescription
          });
        });
      }
    ).on("error", reject);
  });
}

function logWeather() {
  fetchWeather().then(function(wx) {
    // NWS fields can be null when a sensor is offline
    if (wx.temp_c === null || wx.temp_c === undefined) {
      console.error("Station returned no temperature reading; skipping.");
      return;
    }
    var tempF = (wx.temp_c * 9 / 5) + 32;
    var windMph = wx.wind_ms ? (wx.wind_ms * 2.237).toFixed(1) : 0;
    var today = new Date().toISOString().split("T")[0];
    var conditions = (wx.conditions || "").replace(/'/g, "\\'");

    var query = "INSERT INTO homestead_logs.weather_log " +
      "(date, high_temp, low_temp, conditions, wind_mph, notes) VALUES " +
      "('" + today + "', " + tempF.toFixed(1) + ", NULL, '" +
      conditions + "', " + windMph + ", 'auto-logged from NWS')";

    // Pass the SQL on stdin to sidestep shell quoting
    execSync("bq query --use_legacy_sql=false", {
      input: query,
      stdio: "pipe"
    });
    console.log("Weather logged for " + today);
  }).catch(function(err) {
    console.error("Weather fetch failed:", err.message);
  });
}

logWeather();

The NWS API is free, doesn't require authentication, and covers my area reasonably well. The nearest station isn't right at my cabin, but it's close enough for trend tracking. If the NWS says it's -20 and my thermometer says -25, I know the differential and can mentally adjust.
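Since NWS observations arrive in metric and any field can be null when a sensor is offline, the unit conversions are worth isolating in small null-aware helpers. A sketch (helper names are illustrative):

```javascript
// NWS reports Celsius and meters/second; propagate null instead of
// letting arithmetic on a missing reading produce NaN.
function toFahrenheit(celsius) {
  if (celsius === null || celsius === undefined) return null;
  return Math.round(((celsius * 9 / 5) + 32) * 10) / 10;
}

function toMph(metersPerSecond) {
  if (metersPerSecond === null || metersPerSecond === undefined) return null;
  return Math.round(metersPerSecond * 2.237 * 10) / 10;
}
```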


Querying and Summarizing With LLMs

This is where the system gets genuinely useful. I wrote a summary script that pulls recent logs from BigQuery and asks an LLM to generate a human-readable report.

function getRecentLogs(tableName, days) {
  var query = "SELECT * FROM homestead_logs." + tableName +
    " WHERE date >= DATE_SUB(CURRENT_DATE(), INTERVAL " + days + " DAY) " +
    "ORDER BY date DESC";

  var result = execSync('bq query --use_legacy_sql=false --format=json "' + query + '"', {
    encoding: "utf-8"
  });

  return JSON.parse(result);
}

function generateSummary(days) {
  var weather = getRecentLogs("weather_log", days);
  var maintenance = getRecentLogs("maintenance_log", days);
  var supplies = getRecentLogs("supply_log", days);

  var prompt = "You are a homestead management assistant. Analyze these logs from the past " +
    days + " days and provide:\n" +
    "1. Weather summary and trends\n" +
    "2. Maintenance items completed and any upcoming concerns\n" +
    "3. Supply status and anything running low\n" +
    "4. Any patterns or warnings I should pay attention to\n\n" +
    "Weather logs:\n" + JSON.stringify(weather, null, 2) + "\n\n" +
    "Maintenance logs:\n" + JSON.stringify(maintenance, null, 2) + "\n\n" +
    "Supply logs:\n" + JSON.stringify(supplies, null, 2);

  return callClaude(prompt);
}

// Generate a 30-day summary
generateSummary(30).then(function(summary) {
  console.log("\n=== 30-Day Homestead Summary ===\n");
  console.log(summary);
});

The summaries are surprisingly useful. The LLM catches patterns I miss. Things like: "Your generator has been running 30% more hours this month compared to last month — check if the solar panels need clearing." Or: "You've used propane at a rate that suggests your current tank will need refilling around March 8th."

That's not magic. That's just basic math applied to structured data by a model that's good at basic math. But it's basic math I wasn't doing because I was too busy shoveling snow and debugging production code.
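That propane projection is a few lines of arithmetic once the readings are structured. A sketch of the linear burn-rate estimate (the function and record shape are illustrative, not from the actual system):

```javascript
// Linear burn-rate projection: given dated tank-level readings
// (sorted ascending by date), estimate when the level hits a
// reserve threshold.
function projectRefillDate(readings, reserveLevel) {
  if (readings.length < 2) return null;
  var first = readings[0];
  var last = readings[readings.length - 1];
  var days = (new Date(last.date) - new Date(first.date)) / 86400000;
  var burnPerDay = (first.level - last.level) / days;
  if (burnPerDay <= 0) return null; // not draining; nothing to project
  var daysLeft = (last.level - reserveLevel) / burnPerDay;
  var refill = new Date(new Date(last.date).getTime() + daysLeft * 86400000);
  return refill.toISOString().split("T")[0];
}
```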


Supply Tracking and Reorder Alerts

One of the most practical features is supply monitoring. Living 30 minutes from the nearest town means running out of something critical is more than an inconvenience — it's potentially dangerous.

function checkSupplyLevels() {
  var query = "SELECT category, item, " +
    "SUM(CASE WHEN action = 'received' THEN quantity ELSE 0 END) - " +
    "SUM(CASE WHEN action = 'used' THEN quantity ELSE 0 END) as current_stock, " +
    "unit FROM homestead_logs.supply_log " +
    "GROUP BY category, item, unit " +
    "HAVING current_stock >= 0 " +
    "ORDER BY category, item";

  var result = execSync('bq query --use_legacy_sql=false --format=json "' + query + '"', {
    encoding: "utf-8"
  });

  var supplies = JSON.parse(result);

  var prompt = "Here are my current homestead supply levels. I live in a remote cabin " +
    "in Alaska, 30 minutes from town. Flag anything that looks dangerously low for " +
    "winter conditions and suggest a reorder list.\n\n" +
    JSON.stringify(supplies, null, 2);

  return callClaude(prompt);
}

The LLM adds context that a simple threshold check can't. It knows that 20 gallons of propane in January is a very different situation than 20 gallons in July. It knows that running low on water filters matters more when temperatures are below freezing because you can't easily work on the well system.
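For contrast, here's roughly what the season-blind threshold check looks like. It also doubles as an offline fallback for when there's no connectivity for an LLM call (the items and minimums are illustrative):

```javascript
// Plain threshold fallback: flags anything below a hard minimum,
// season-blind by design. Missing items count as dangerously low.
var MIN_LEVELS = {
  propane_gallons: 40,
  water_filters: 2,
  generator_oil_quarts: 4
};

function flagLowSupplies(stock) {
  // stock: { item_name: quantity }
  return Object.keys(MIN_LEVELS).filter(function(item) {
    var qty = stock[item];
    return qty === undefined || qty < MIN_LEVELS[item];
  });
}
```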


What I've Learned

After running this system for about four months, a few things stand out.

LLMs are great fuzzy parsers. The natural language input is the killer feature. I can type a log entry in whatever half-awake phrasing comes to mind at 6 AM in a cold cabin, and the system figures it out. Trying to build this with regex or traditional NLP would have taken ten times longer and worked half as well.

BigQuery is overkill, and that's fine. I'm using maybe 0.001% of BigQuery's capacity. But it's free at this scale, the CLI is clean, and SQL is a query language I already know. Sometimes the right tool is the one you don't have to think about.

The summary reports are the real value. Logging is only useful if you actually review the data. Having an LLM generate weekly summaries with actionable insights means the data actually gets used instead of sitting in a table forever.

Keep it simple. This whole system is about 300 lines of JavaScript across four files. No framework. No build step. No Docker. It runs on my laptop and could run on a Raspberry Pi. When you're troubleshooting at -30 degrees, you want as few moving parts as possible.


The Bigger Picture

I think there's something interesting happening at the intersection of old-school self-reliance and modern AI tooling. The homesteading community is stereotypically anti-tech, and the tech community is stereotypically disconnected from the physical world. But the skills transfer more than you'd expect.

Systems thinking is systems thinking, whether you're designing a microservice architecture or planning a winter fuel strategy. Monitoring is monitoring, whether it's server metrics or propane levels. And automation is automation, whether it's a CI/CD pipeline or a weather logging cron job.

I'm not suggesting everyone should move to Alaska and build log-parsing scripts. But if you do find yourself straddling two very different worlds, don't assume the tools from one can't help with the other. My cabin runs better because I'm an engineer. And my engineering is better because I spend half my time solving problems where the consequences are real and immediate — not just a Jira ticket.

If the pipes freeze, there's no rollback button. That kind of accountability makes you a better systems designer.


Shane Larson is a software engineer, technical author, and homesteader based in Caswell Lakes, Alaska. He writes about AI, software architecture, and the surprisingly overlapping worlds of off-grid living and cloud computing at grizzlypeaksoftware.com.
