Categories
homelab Kubernetes Linux proxmox

How to create a Proxmox Ubuntu cloud-init image

Background for why I wanted to make a Proxmox Ubuntu cloud-init image

I have recently ventured down the path of attempting to learn CI/CD concepts. I have tried Docker multiple times and haven't really enjoyed the nuances any of those times. To me, LXC/LXD containers are far easier to understand than Docker when coming from a 'one VM per service' background. LXC/LXD containers can be assigned IP addresses (or get them from DHCP) and otherwise behave basically exactly like a VM from a networking perspective. Docker's networking model is quite a bit more nuanced. Lots of people say it's easier, but having everything run on 'localhost:[high number port]' doesn't work well when you've got lots of services, unless you do some reverse proxying with something like Traefik, which is yet another configuration step.

It is so much easier to just have an LXC get an IP via DHCP; then it's accessible by hostname right off the bat (I use pfSense for DHCP/DNS, and all DHCP leases are entered right into DNS). Regardless, I know Kubernetes is the new hotness, so I figured I need to learn it. Every tutorial says you need a master and at least two worker nodes. No sense making three separate virtual machines – let's use the magic of virtualization and clone some images! I plan on using Terraform to deploy the virtual machines for my Kubernetes cluster (as in, I've already used this Proxmox Ubuntu cloud-init image to make my own Kubernetes nodes but haven't documented it yet).

Overview

The quick summary for this tutorial is:

  1. Download a base Ubuntu cloud image
  2. Install some packages into the image
  3. Create a Proxmox VM using the image
  4. Convert it to a template
  5. Clone the template into a full VM and set some parameters
  6. Automate it so it runs on a regular basis (extra credit)?
  7. ???
  8. Profit!

YouTube Video Link

If you prefer video versions to follow along, please head on over to https://youtu.be/1sPG3mFVafE for a live-action video of me creating the Proxmox Ubuntu cloud-init image and explaining why we're running each command.

#1 – Downloading the base Ubuntu image

Luckily, Ubuntu (my preferred distro; I'm guessing other distros do the same) provides base cloud images that are updated on a regular basis – https://cloud-images.ubuntu.com/. We are interested in the "current" release of Ubuntu 20.04 Focal, which is the current Long Term Support version. Further, since Proxmox uses KVM, we will be pulling the KVM disk image (the .img file):

wget https://cloud-images.ubuntu.com/focal/current/focal-server-cloudimg-amd64.img
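If you want to verify the download, Ubuntu publishes a SHA256SUMS file in the same directory as the images, so a quick integrity check looks something like this:

wget https://cloud-images.ubuntu.com/focal/current/SHA256SUMS
sha256sum --check --ignore-missing SHA256SUMS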

#2 – Install packages

It took me quite a while into my Terraform debugging process to determine that qemu-guest-agent wasn't included in the cloud-init image. Why it isn't, I have no idea. Luckily, there is a very cool tool that I just learned about that enables installing packages directly into an image. The tool is called virt-customize and it comes in the libguestfs-tools package ("libguestfs is a set of tools for accessing and modifying virtual machine (VM) disk images" – https://www.libguestfs.org/).

Install the tools:

sudo apt update -y && sudo apt install libguestfs-tools -y

Then install qemu-guest-agent into the newly downloaded image:

sudo virt-customize -a focal-server-cloudimg-amd64.img --install qemu-guest-agent

At this point you can presumably install whatever else you want into the image but I haven’t tested installing other packages. qemu-guest-agent was what I needed to get the VM recognized by Terraform and accessible.
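For what it's worth, virt-customize's --install flag takes a comma-separated list, so you could presumably bake other packages in at the same time; vim and htop below are just placeholder examples, and I haven't tested anything beyond qemu-guest-agent:

sudo virt-customize -a focal-server-cloudimg-amd64.img --install qemu-guest-agent,vim,htop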

Update 2021-12-30 – it is possible to inject the SSH keys into the cloud image itself before turning it into a template and VM. You need to create a user first and the necessary folders:

# not quite working yet. skip this and continue
#sudo virt-customize -a focal-server-cloudimg-amd64.img --run-command 'useradd austin'
#sudo virt-customize -a focal-server-cloudimg-amd64.img --run-command 'mkdir -p /home/austin/.ssh'
#sudo virt-customize -a focal-server-cloudimg-amd64.img --ssh-inject austin:file:/home/austin/.ssh/id_rsa.pub
#sudo virt-customize -a focal-server-cloudimg-amd64.img --run-command 'chown -R austin:austin /home/austin'
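# one untested variation that might be worth a try: let useradd create the home folder
# itself with -m before injecting the key (treat this as a guess, not a verified fix)
#sudo virt-customize -a focal-server-cloudimg-amd64.img --run-command 'useradd -m -s /bin/bash austin'
#sudo virt-customize -a focal-server-cloudimg-amd64.img --ssh-inject austin:file:/home/austin/.ssh/id_rsa.pub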

#3 – Create a Proxmox virtual machine using the newly modified image

The commands here should be relatively self-explanatory, but in general we are creating a VM (VMID=9000; basically every other resource I saw used this ID, so we will too) with basic resources (2 cores, 2048 MB RAM), assigning networking to a virtio adapter on vmbr0, importing the image to storage (your storage name will be different if you're not using ZFS, probably either 'local' or 'local-lvm'), setting disk 0 to use the imported image, setting the boot drive to that disk, attaching the cloud-init drive as ide2 (which apparently appears as a CD-ROM to the VM, at least upon initial boot), and adding a virtual serial port. I had only used qm to force stop VMs before this, but it's pretty useful.

sudo qm create 9000 --name "ubuntu-2004-cloudinit-template" --memory 2048 --cores 2 --net0 virtio,bridge=vmbr0
sudo qm importdisk 9000 focal-server-cloudimg-amd64.img local-zfs
sudo qm set 9000 --scsihw virtio-scsi-pci --scsi0 local-zfs:vm-9000-disk-0
sudo qm set 9000 --boot c --bootdisk scsi0
sudo qm set 9000 --ide2 local-zfs:cloudinit
sudo qm set 9000 --serial0 socket --vga serial0
sudo qm set 9000 --agent enabled=1

You can start the VM up at this point if you'd like and make any other changes you want, because the next step is converting it to a template. If you do boot it, I'll be completely honest: I have no idea how to log into it. I actually just googled this because I don't want to leave you without an answer; it looks like you can use the same virt-customize we used before to set a root password, according to Stack Overflow (https://stackoverflow.com/questions/29137679/login-credentials-of-ubuntu-cloud-server-image). I'm not going to put that into a command window here because cloud-init is really meant for public/private key authentication (see post here for a quick SSH tutorial).

#4 – Convert VM to a template

OK, if you made any changes, shut down the VM. If you didn't boot the VM, that's perfectly fine too. Either way, we need to convert it to a template:

sudo qm template 9000

And now we have a functioning template!

screenshot of proxmox ui showing ubuntu 20.04 cloud-init template

#5 – Clone the template into a full VM and set some parameters

From this point you can clone the template as much as you want. But, each time you do so it makes sense to set some parameters, namely the SSH keys present in the VM as well as the IP address for the main interface. You could also add the SSH keys with virt-customize but I like doing it here.

First, clone the VM (here we are cloning the template with ID 9000 to a new VM with ID 999):

sudo qm clone 9000 999 --name test-clone-cloud-init

Next, set the SSH keys and IP address:

sudo qm set 999 --sshkey ~/.ssh/id_rsa.pub
sudo qm set 999 --ipconfig0 ip=10.98.1.96/24,gw=10.98.1.1
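
Proxmox also accepts ip=dhcp here if you'd rather have the clone pull an address via DHCP (which, as I mentioned at the start of this post, is how I like to run things):

sudo qm set 999 --ipconfig0 ip=dhcp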

It’s now ready to start up!

sudo qm start 999

You should be able to log in without any problems (after trusting the SSH fingerprint). Note that the username is ‘ubuntu’, not whatever the username is for the key you provided.

ssh ubuntu@10.98.1.96

Once you’re happy with how things worked, you can stop the VM and clean up the resources:

sudo qm stop 999 && sudo qm destroy 999
rm focal-server-cloudimg-amd64.img

#6 – Automating the process

I have not done so yet, but if you create VMs on a somewhat regular basis, it wouldn't be hard to stick all of the above into a simple shell script (update 2022-04-19: simple shell script below) and run it via cron on a weekly basis, or whatever frequency you prefer. I can't tell you how many times I've made a new VM from whatever .iso I downloaded only to have the first task be an apt upgrade that takes forever to run ('sudo apt update' reporting "176 packages can be upgraded"). Having a nice template always ready to go would solve that issue and would frankly save me a ton of time.

#6.5 – Shell script to create template

# installing libguestfs-tools only required once, prior to first run
sudo apt update -y
sudo apt install libguestfs-tools -y

# remove existing image in case last execution did not complete successfully
rm focal-server-cloudimg-amd64.img
wget https://cloud-images.ubuntu.com/focal/current/focal-server-cloudimg-amd64.img
sudo virt-customize -a focal-server-cloudimg-amd64.img --install qemu-guest-agent
sudo qm create 9000 --name "ubuntu-2004-cloudinit-template" --memory 2048 --cores 2 --net0 virtio,bridge=vmbr0
sudo qm importdisk 9000 focal-server-cloudimg-amd64.img local-zfs
sudo qm set 9000 --scsihw virtio-scsi-pci --scsi0 local-zfs:vm-9000-disk-0
sudo qm set 9000 --boot c --bootdisk scsi0
sudo qm set 9000 --ide2 local-zfs:cloudinit
sudo qm set 9000 --serial0 socket --vga serial0
sudo qm set 9000 --agent enabled=1
sudo qm template 9000
rm focal-server-cloudimg-amd64.img
echo "next up, clone VM, then expand the disk"
echo "you also still need to copy ssh keys to the newly cloned VM"

#7-8 – Using this template with Terraform to automate VM creation

Next post – How to deploy VMs in Proxmox with Terraform

References

https://matthewkalnins.com/posts/home-lab-setup-part-1-proxmox-cloud-init/
https://registry.terraform.io/modules/sdhibit/cloud-init-vm/proxmox/latest/examples/ubuntu_single_vm

My original notes


# create cloud image VM
wget https://cloud-images.ubuntu.com/focal/20210824/focal-server-cloudimg-amd64.img
sudo qm create 9000 --name "ubuntu-2004-cloudinit-template" --memory 2048 --cores 2 --net0 virtio,bridge=vmbr0

# to install qemu-guest-agent or whatever into the guest image
#sudo apt-get install libguestfs-tools
#virt-customize -a focal-server-cloudimg-amd64.img --install qemu-guest-agent
sudo qm importdisk 9000 focal-server-cloudimg-amd64.img local-zfs
sudo qm set 9000 --scsihw virtio-scsi-pci --scsi0 local-zfs:vm-9000-disk-0
sudo qm set 9000 --boot c --bootdisk scsi0
sudo qm set 9000 --ide2 local-zfs:cloudinit
sudo qm set 9000 --serial0 socket --vga serial0
sudo qm template 9000

# clone cloud image to new VM
sudo qm clone 9000 999 --name test-clone-cloud-init
sudo qm set 999 --sshkey ~/.ssh/id_rsa.pub
sudo qm set 999 --ipconfig0 ip=10.98.1.96/24,gw=10.98.1.1
sudo qm start 999
  
# remove known host because SSH key changed
ssh-keygen -f "/home/austin/.ssh/known_hosts" -R "10.98.1.96"

# ssh in
ssh -i ~/.ssh/id_rsa ubuntu@10.98.1.96

# stop and destroy VM
sudo qm stop 999 && sudo qm destroy 999
Categories
Offgrid Solar Uncategorized

Sending MPP inverter data to MQTT and InfluxDB with Python

Catching up

Hey there, welcome back to Austin’s Nerdy Things. It’s been a while since my last post – life happens. I haven’t lost sight of the blog. Just needed to do some other things and finish up some projects before documenting stuff.

Background

If you recall, my DIY hybrid solar setup with battery backup is up and running. I wrote about it here – DIY solar with battery backup – up and running!

The MPP LV2424 inverter I’m using puts out a lot of data. Some of it is quite useful, some of it not so much. Regardless, I am capturing all of it in my InfluxDB database with Python. This allows me to chart it in Grafana, which I use for basic monitoring stuff around the house. This post will document getting data from the MPP inverter to Grafana.

Jumping ahead

Final product first. This is a screenshot showing my complete (work-in-progress) Grafana dashboard for my DIY hybrid solar setup.

screenshot showing Grafana dashboard which contains various charts and graphs of MPP solar inverter voltages and data
Solar dashboard as seen in Grafana pulling data from InfluxDB

The screenshot shows the last 6 hours worth of data from my system. It was a cloudy and then stormy afternoon/evening here in Denver (we got 0.89 inches of rain since midnight as of typing this post!), so the solar production wasn’t great. The panels are as follows:

  • Temperatures – showing temperatures from 3 sources: the MPP LV2424 inverter heat sink, and both BMS temperature probes (one is on the BMS board itself, the other is a probe on the battery bank). The inverter has at least 2 temperature readings, maybe 3. They all basically show the same thing.
  • Solar input – shows solar voltage as seen by the solar charge controller as well as solar power (in watts) going through the charge controller.
  • Cell voltages – the voltage reading of each of my 8 battery bank cells as reported by the BMS. The green graph also shows the delta between max and min cells. They are still pretty balanced in the flat part of the discharge curve.
  • Total power – a mashup of what the inverter is putting out, what the solar is putting in, the difference between the two, and what the BMS is reporting. I’m still trying to figure out all the nuances here. There is definitely a discrepancy between what the inverter is putting out and what the solar is putting in when the batteries are fully charged. I believe the difference is the power required to keep the transformer energized (typically ranges from 30-60W).
  • DC voltages – as reported by the inverter and the BMS. The inverter is accurate to 0.1V, the BMS goes to 0.01V.
  • AC voltages – shows the input voltage from the grid and the output voltage from the inverter. These will match unless the inverter is disconnected from the grid.
  • Data table – miscellaneous information from the inverter that isn’t graphable
  • Output load – how much output I’m using compared to the inverter’s limit
  • Total Generation – how much total energy has been captured by the inverter/solar panels. This is limited because I’m not back feeding the grid.

Getting data out of the MPP Solar LV2424 inverter with Python to MQTT

I am using two cables to plug the inverter into a computer. The first is the serial cable that came with the inverter. The second is a simple USB to RS232 serial adapter plugged into a Dell Micro 3070.

The computer is running Proxmox, which is a virtual machine hypervisor. That doesn’t matter for this post, we can ignore it. Just pretend the USB cable is plugged directly into a computer running Ubuntu 20.04 Linux.

The main bit of software I’m using is published on GitHub by jblance under the name ‘mpp-solar’ – https://github.com/jblance/mpp-solar. This is a utility written in Python to communicate with MPP inverters.
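I don't think the install step shows up anywhere below, so for completeness: one way to install it (assuming pip3 is available on the box) is straight from the repo; check the project's README for the currently recommended method, since packaging details may have changed.

sudo pip3 install git+https://github.com/jblance/mpp-solar.git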

There was a good bit of fun trying to figure out exactly what command I needed to run to get data out of the inverter as evidenced by the history command:

Trying to figure out the usage of the Python mpp-solar utility

In the end, what worked for me to get the data is the following. I believe the protocol (-P) will be different for different MPP Solar inverters:

sudo mpp-solar -p /dev/ttyUSB0 -b 2400 -P PI18 --getstatus

And the results are below. The grid voltage reported is a bit low because the battery started charging from the grid a few minutes before this was run.

austin@mpp-linux:~/mpp-solar-python$ sudo mpp-solar -p /dev/ttyUSB0 -b 2400 -P PI18 --getstatus
Command: ET - Total Generated Energy query
------------------------------------------------------------
Parameter                       Value           Unit
working_mode                    Hybrid mode(Line mode, Grid mode)
grid_voltage                    111.2           V
grid_frequency                  59.9            Hz
ac_output_voltage               111.2           V
ac_output_frequency             59.9            Hz
ac_output_apparent_power        155             VA
ac_output_active_power          139             W
output_load_percent             5               %
battery_voltage                 27.1            V
battery_voltage_from_scc        0.0             V
battery_voltage_from_scc2       0.0             V
battery_discharge_current       0               A
battery_charging_current        19              A
battery_capacity                76              %
inverter_heat_sink_temperature  30              °C
mppt1_charger_temperature       0               °C
mppt2_charger_temperature       0               °C
pv1_input_power                 0               W
pv2_input_power                 0               W
pv1_input_voltage               0.0             V
pv2_input_voltage               0.0             V
setting_value_configuration_state       Something changed
mppt1_charger_status            abnormal
mppt2_charger_status            abnormal
load_connection                 connect
battery_power_direction         charge
dc/ac_power_direction           AC-DC
line_power_direction            input
local_parallel_id               0
total_generated_energy          91190           Wh

And to get the same data right into MQTT I am using the following:

sudo mpp-solar -p /dev/ttyUSB0 -P PI18 --getstatus -o mqtt -q mqtt --mqtttopic mpp

The above command is being run as a cron job once a minute. The default baud rate for the inverter is 2400 bps (yes, bits per second), which is super slow so a full poll takes ~6 seconds. Kind of annoying in 2021 but not a huge problem. The cron entry for the command is this:

# this program feeds a systemd service to convert the outputted mqtt to influx points
* * * * * /usr/local/bin/mpp-solar -p /dev/ttyUSB0 -P PI18 --getstatus -o mqtt -q mqtt --mqtttopic mpp

So with that we have MPP inverter data going to MQTT.

Putting MPP data into InfluxDB from MQTT

Here we need another script written in… take a guess… Python! This Python script basically just opens a connection to an MQTT broker and forwards any updates to InfluxDB. The full script is a bit more complicated, and I actually stripped a lot out because my MQTT topic names didn't fit the template the original author used. I have started using this framework in other places to do the MQTT to InfluxDB translation. I like things going through an intermediate MQTT broker so they can also be picked up for easy viewing in Home Assistant. Original code from https://github.com/KHoos/mqtt-to-influxdb-forwarder. The original code seems like it was built to be more robust than what I'm using it for (read: I have no idea what half of it does), but it worked for my purposes.

You’ll need a simple text file with your InfluxDB password and then reference it in the arguments. If your password is ‘password’, the only contents of the file should be ‘password’. I added the isFloat() function to basically make sure that strings weren’t getting added to the numeric tables in InfluxDB. I honestly find the structure/layout of storing stuff in Influx quite confusing so I’m sure there’s a better way to do this.

#!/usr/bin/env python
# -*- coding: utf-8 -*-
####################################
# originally found at/modified from https://github.com/KHoos/mqtt-to-influxdb-forwarder
####################################

# forwarder.py - forwards IoT sensor data from MQTT to InfluxDB
#
# Copyright (C) 2016 Michael Haas <[email protected]>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301  USA

import argparse
import paho.mqtt.client as mqtt
from influxdb import InfluxDBClient
import json
import re
import logging
import sys
import requests.exceptions


class MessageStore(object):

        def store_msg(self, node_name, measurement_name, value):
                raise NotImplementedError()

class InfluxStore(MessageStore):

        logger = logging.getLogger("forwarder.InfluxStore")

        def __init__(self, host, port, username, password_file, database):
                password = open(password_file).read().strip()
                self.influx_client = InfluxDBClient(
                        host=host, port=port, username=username, password=password, database=database)

        def store_msg(self, database, sensor, value):
                influx_msg = {
                        'measurement': database,
                        'tags': {'sensor': sensor},
                        'fields': {'value' : value}
                }
                self.logger.debug("Writing InfluxDB point: %s", influx_msg)
                try:
                        self.influx_client.write_points([influx_msg])
                except requests.exceptions.ConnectionError as e:
                        self.logger.exception(e)

class MessageSource(object):

        def register_store(self, store):
                if not hasattr(self, '_stores'):
                        self._stores = []
                self._stores.append(store)

        @property
        def stores(self):
                # return copy
                return list(self._stores)

def isFloat(str_val):
  try:
    float(str_val)
    return True
  except ValueError:
    return False

def convertToFloat(str_val):
    if isFloat(str_val):
        fl_result = float(str_val)
        return fl_result
    else:
        return str_val

class MQTTSource(MessageSource):

        logger = logging.getLogger("forwarder.MQTTSource")

        def __init__(self, host, port, node_names, stringify_values_for_measurements):
                self.host = host
                self.port = port
                self.node_names = node_names
                self.stringify = stringify_values_for_measurements
                self._setup_handlers()

        def _setup_handlers(self):
                self.client = mqtt.Client()

                def on_connect(client, userdata, flags, rc):
                        self.logger.info("Connected with result code  %s", rc)
                        # subscribe to /node_name/wildcard
                        #for node_name in self.node_names:
                        # topic = "{node_name}/#".format(node_name=node_name)
                        topic = "get_status/status/#"
                        self.logger.info("Subscribing to topic %s", topic)
                        client.subscribe(topic)

                def on_message(client, userdata, msg):
                        self.logger.debug("Received MQTT message for topic %s with payload %s", msg.topic, msg.payload)
                        list_of_topics = msg.topic.split('/')
                        measurement = list_of_topics[2]
                        if list_of_topics[len(list_of_topics)-1] == 'unit':
                                value = None
                        else:
                                value = msg.payload
                                decoded_value = value.decode('UTF-8')
                                if isFloat(decoded_value):
                                        str_value = convertToFloat(decoded_value)
                                        for store in self.stores:
                                                store.store_msg("power_measurement",measurement,str_value)
                                else:
                                        for store in self.stores:
                                                store.store_msg("power_measurement_strings",measurement,decoded_value)





                self.client.on_connect = on_connect
                self.client.on_message = on_message

        def start(self):
                print(f"starting mqtt on host: {self.host} and port: {self.port}")
                self.client.connect(self.host, self.port)
                # Blocking call that processes network traffic, dispatches callbacks and
                # handles reconnecting.
                # Other loop*() functions are available that give a threaded interface and a
                # manual interface.
                self.client.loop_forever()


def main():
        parser = argparse.ArgumentParser(
                description='MQTT to InfluxDB bridge for IOT data.')
        parser.add_argument('--mqtt-host', default="mqtt", help='MQTT host')
        parser.add_argument('--mqtt-port', default=1883, help='MQTT port')
        parser.add_argument('--influx-host', default="dashboard", help='InfluxDB host')
        parser.add_argument('--influx-port', default=8086, help='InfluxDB port')
        parser.add_argument('--influx-user', default="power", help='InfluxDB username')
        parser.add_argument('--influx-pass', default="<I have a password here, unclear if the pass-file takes precedence>", help='InfluxDB password')
        parser.add_argument('--influx-pass-file', default="/home/austin/mpp-solar-python/pass.file", help='InfluxDB password file')
        parser.add_argument('--influx-db', default="power", help='InfluxDB database')
        parser.add_argument('--node-name', default='get_status', help='Sensor node name', action="append")
        parser.add_argument('--stringify-values-for-measurements', required=False,      help='Force str() on measurements of the given name', action="append")
        parser.add_argument('--verbose', help='Enable verbose output to stdout', default=False, action='store_true')
        args = parser.parse_args()

        if args.verbose:
                logging.basicConfig(stream=sys.stdout, level=logging.DEBUG)
        else:
                logging.basicConfig(stream=sys.stdout, level=logging.WARNING)

        print("creating influxstore")
        store = InfluxStore(host=args.influx_host, port=args.influx_port, username=args.influx_user, password_file=args.influx_pass_file, database=args.influx_db)
        print("creating mqttsource")
        source = MQTTSource(host=args.mqtt_host,
                                                port=args.mqtt_port, node_names=args.node_name,
                                                stringify_values_for_measurements=args.stringify_values_for_measurements)
        print("registering store")
        source.register_store(store)
        print("start")
        source.start()

if __name__ == '__main__':
        main()

Running the MQTT to InfluxDB script as a system daemon

Next up, we need to run the MQTT to InfluxDB Python script as a daemon so it starts with the machine and runs in the background. If you haven’t noticed by now, this is the standard pattern for most of the stuff I do – either a cron job or daemon to get data and another daemon to put it where I want it. Sometimes they’re the same.

austin@mpp-linux:~$ cat /etc/systemd/system/mpp-solar.service
[Unit]
Description=MPP inverter data - MQTT to influx
After=multi-user.target

[Service]
User=austin
Type=simple
Restart=always
RestartSec=10
# data feeds this script from a root cronjob running every 60s
ExecStart=/usr/bin/python3 /home/austin/mpp-solar-python/main.py

[Install]
WantedBy=multi-user.target

Then activate it:

austin@mpp-linux:~$ sudo systemctl daemon-reload
austin@mpp-linux:~$ sudo systemctl enable mpp-solar.service
austin@mpp-linux:~$ sudo systemctl start mpp-solar.service
austin@mpp-linux:~$ sudo systemctl status mpp-solar.service
● mpp-solar.service - MPP inverter data - MQTT to influx
     Loaded: loaded (/etc/systemd/system/mpp-solar.service; enabled; vendor preset: enabled)
     Active: active (running) since Sat 2021-07-31 02:21:32 UTC; 1 day 13h ago
   Main PID: 462825 (python3)
      Tasks: 1 (limit: 1072)
     Memory: 19.2M
     CGroup: /system.slice/mpp-solar.service
             └─462825 /usr/bin/python3 /home/austin/mpp-solar-python/main.py

Jul 31 02:21:32 mpp-linux systemd[1]: Started MPP inverter data - MQTT to influx.

All done

With that, you should now have data flowing into your InfluxDB instance from your MPP inverter via this Python script. This is exactly what I’m using for my LV2424 but it should work with others like the PIP LV2424 MSD, PIP-4048MS, IPS stuff, LV5048, and probably a lot of others.
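
If you want to sanity-check that points are actually landing in InfluxDB, a quick query with the InfluxDB 1.x CLI works; this assumes the defaults from the script above (a database named 'power' and a measurement named 'power_measurement'):

influx -database power -execute 'SELECT * FROM power_measurement ORDER BY time DESC LIMIT 5'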

Next Steps

Next post will cover designing the Grafana dashboard to show this data.

Categories
Programming

CS193p Spring 2021 Lecture 4 & Assignment 2

After lecture 4

A lot of this stuff still isn’t making a ton of sense to me. I really struggled with how to init the theme for assignment 2. The key was optionals. How to do it came to me in the shower. I am writing this post after doing the changes from Lecture 4, Assignment 2, and Lecture 5 so I don’t have a whole lot specifically around lecture 4.

Code

The viewModel is now hooked up to both the View and the Model. This MVVM stuff is clicking for me, thankfully.

MemoryGame.swift – includes scoring:

//
//  MemoryGame.swift
//  Memorize
//
//  Created by Austin on 5/28/21.
//  austinsnerdythings.com

import Foundation

// model
struct MemoryGame<CardContent> where CardContent: Equatable {
    private(set) var cards: Array<Card>
    private var indexOfTheOneAndOnlyFaceUpCard: Int?
    private(set) var score = 0
    
    mutating func choose(_ card: Card) {
        if let chosenIndex = cards.firstIndex(where: { $0.id == card.id }),
           !cards[chosenIndex].isFaceUp,
           !cards[chosenIndex].isMatched
        {
            if let potentialMatchIndex = indexOfTheOneAndOnlyFaceUpCard {
                cards[chosenIndex].hasBeenSeenThisManyTimes += 1
                cards[potentialMatchIndex].hasBeenSeenThisManyTimes += 1
                if cards[chosenIndex].content == cards[potentialMatchIndex].content {
                    // match
                    cards[chosenIndex].isMatched = true
                    cards[potentialMatchIndex].isMatched = true
                    score += 2
                } else if cards[chosenIndex].hasBeenSeenThisManyTimes > 1 ||
							cards[potentialMatchIndex].hasBeenSeenThisManyTimes > 1 {
					// mismatch
					 score -= 1
				}
                indexOfTheOneAndOnlyFaceUpCard = nil
            } else {
                for index in cards.indices {
                    cards[index].isFaceUp = false
                }
                indexOfTheOneAndOnlyFaceUpCard = chosenIndex
            }
            
            cards[chosenIndex].isFaceUp.toggle()
        }
        print("\(cards)")
    }
    
    init(numberOfPairsOfCards: Int, createCardContent: (Int) -> CardContent) {
        cards = Array<Card>()
        // add number of pairs of cards x 2 cards to card array
        for pairIndex in 0..<numberOfPairsOfCards {
            let content: CardContent = createCardContent(pairIndex)
            cards.append(Card(content: content, id: pairIndex*2))
            cards.append(Card(content: content, id: pairIndex*2+1))

        }
        cards.shuffle()
    }
    
    struct Card: Identifiable {
        var isFaceUp: Bool = false
        var isMatched: Bool = false
        var content: CardContent
        var id: Int
        var hasBeenSeenThisManyTimes: Int = 0
    }
}

EmojiMemoryGame.swift – we’ve moved the theme stuff into its own struct/file

//
//  EmojiMemoryGame.swift
//  Memorize
//
//  Created by Austin on 5/28/21.
//  austinsnerdythings.com

import SwiftUI

// viewModel
class EmojiMemoryGame: ObservableObject {
    @Published private var gameModel: MemoryGame<String>
    private(set) var theme: Theme
    
    static func createMemoryGame(theme: Theme) -> MemoryGame<String> {
        let emojis: Array<String> = theme.emojis.shuffled()
		var cardsToShow = theme.numberOfPairsOfCards ?? Int.random(in: 3...theme.emojis.count)
		if cardsToShow > theme.emojis.count {
			cardsToShow = theme.emojis.count
		}
        return MemoryGame<String>(numberOfPairsOfCards: cardsToShow) { pairIndex in
            emojis[pairIndex]
        }
    }
    
	init(startingTheme: Theme? = nil)
    {
		let selectedTheme = startingTheme ?? themes.randomElement()!
		self.theme = selectedTheme
		gameModel = EmojiMemoryGame.createMemoryGame(theme: selectedTheme)
    }

    var cards: Array<MemoryGame<String>.Card> {
        return gameModel.cards
    }
	
	var score: Int {
		return gameModel.score
	}
    
    // MARK: - INTENTS
    func choose(_ card: MemoryGame<String>.Card) {
        gameModel.choose(card)
    }
    
    func startNewGame() {
        let newTheme = themes.randomElement()!
		self.theme = newTheme
		gameModel = EmojiMemoryGame.createMemoryGame(theme: newTheme)
    }
}

MemorizeApp.swift – added the viewModel argument to the init here

//
//  MemorizeApp.swift
//  Memorize
//
//  Created by Austin on 5/25/21.
//  austinsnerdythings.com

import SwiftUI

@main
struct MemorizeApp: App {
    let game = EmojiMemoryGame()
    
    var body: some Scene {
        WindowGroup {
            ContentView(viewModel: game)
        }
    }
}

Theme.swift

//
//  Theme.swift
//  Memorize
//
//  Created by Austin on 6/7/21.
//

import Foundation
import SwiftUI

//    struct Theme: Identifiable {
struct Theme {
    var name: String
    var emojis: [String]
    var numberOfPairsOfCards: Int?
    var baseColor: Color
}

let themes: [Theme] = [
	Theme(name: "vehicles",
		  emojis: ["?","?","?","?","?","?","?","?","?","?","?","?","?","✈️","?","?","?","?","?","?","?","?","?","?"],
		  baseColor: Color.red),
	Theme(name: "fruits",
		  emojis: ["?","?","?","?","?","?","?","?","?","?","?","?"],
		  baseColor: Color.yellow),
	Theme(name: "animals",
		  emojis: ["?","?","?","?","?","?","?","?","?","?","?","?"],
		  numberOfPairsOfCards: 20,
		  baseColor: Color.blue)

]

ContentView.swift

//
//  ContentView.swift
//  Memorize - Stanford CS193p, Spring 2021
//  After assignment 1
//
//  Created by Austin from austinsnerdythings.com on 5/27/21.
//

import SwiftUI

// view
struct ContentView: View {
    @ObservedObject var viewModel: EmojiMemoryGame
    
    var body: some View {
        VStack {
			HStack {
				Text("Memorize!").font(.largeTitle)
				Spacer()
				HStack {
					VStack {
						Text(viewModel.theme.name).font(.title)
						Text("Score: \(viewModel.score)")
					}
					Button("New Game") {
						viewModel.startNewGame()
					}
				}
				
			}
            ScrollView {
                LazyVGrid(columns: [GridItem(.adaptive(minimum: 80))]){
                    ForEach(viewModel.cards[0..<viewModel.cards.count]) { card in
                        CardView(card: card)
                            .aspectRatio(2/3, contentMode: .fit)
                            .onTapGesture {
                                viewModel.choose(card)
                            }
                    }
                }
            }
            .foregroundColor(viewModel.theme.baseColor)
            .font(.largeTitle)
            .padding(.horizontal)
        }
    }
}

struct CardView: View {
    let card: MemoryGame<String>.Card
    var body: some View {
        ZStack {
            let shape = RoundedRectangle(cornerRadius: 20)
            if card.isFaceUp {
                shape.fill().foregroundColor(.white)
                shape.strokeBorder(lineWidth: 3)
                Text(card.content).font(.largeTitle)
            } else if card.isMatched {
                shape.opacity(0)
            } else {
                shape.fill()
            }
        }
    
    }
}

struct ContentView_Previews: PreviewProvider {
    static var previews: some View {
        let game = EmojiMemoryGame()
        ContentView(viewModel: game)
            .preferredColorScheme(.light)
        ContentView(viewModel: game)
            .preferredColorScheme(.dark)
    }
}

References

I gained some inspiration (and cleared up a lot of confusion) from two GitHub repos:

Conclusion

Still a slog. Still learning. CS193p Spring 2021 Lecture 4 is probably where I would start wondering if I should drop the class if I were a Stanford student. The stuff from Lecture 5 (post coming up), where the professor took 20 lines and shrank them to 2, is still a bit much for me. He says it improves readability. It does, but stuffing everything into a single line does hinder debugging.

Categories
Programming

Learning Swift – CS193p Spring 2021 Lecture 3

After lecture 3

So lecture 3 really pointed out to me where/why I had trouble learning Swift the first time around. The shortened closures and the whole “if this is the last argument for the function call, drop it” thing don’t make it easy for people new to the language. Words/phrases that came to mind when I realized what happened include “cute”, “nuanced”, and “too concise”. I wrote a whole post about it here. This lecture series will get me back on the right track to learn Swift. Link to after lecture 2/assignment post here.

Code

The code compiles. We did not hook the new model or viewModel up to the view yet, so there are no UI updates with this post. This is the first post where the code is split across multiple files. (This means I should move to GitHub or something similar.)

MemoryGame.swift

//
//  MemoryGame.swift
//  Memorize
//
//  Created by Austin on 5/28/21.
//  austinsnerdythings.com

import Foundation

struct MemoryGame<CardContent> {
    private(set) var cards: Array<Card>
    
    func choose(_ card: Card) {
        // this is where the game logic will go
    }
    
    init(numberOfPairsOfCards: Int, createCardContent: (Int) -> CardContent) {
        cards = Array<Card>()
        // add number of pairs of cards x 2 cards to card array
        for pairIndex in 0..<numberOfPairsOfCards {
            let content: CardContent = createCardContent(pairIndex)
            cards.append(Card(content: content))
            cards.append(Card(content: content))

        }
    }
    
    struct Card {
        var isFaceUp: Bool = false
        var isMatched: Bool = false
        var content: CardContent
    }
}

EmojiMemoryGame.swift

//
//  EmojiMemoryGame.swift
//  Memorize
//
//  Created by Austin on 5/28/21.
//  austinsnerdythings.com

import SwiftUI


class EmojiMemoryGame {
    static let emojis = ["?","?","?","?","?","?","?","?","?","?","?","?","?","✈️","?","?","?","?","?","?","?","?","?","?"]
    
    static func createMemoryGame() -> MemoryGame<String> {
        return MemoryGame<String>(numberOfPairsOfCards: 4) { pairIndex in
            EmojiMemoryGame.emojis[pairIndex]
        }
    }
    
private var model: MemoryGame<String> =
    MemoryGame<String>(numberOfPairsOfCards: 4) { _ in "A" }
    
    var cards: Array<MemoryGame<String>.Card> {
        return model.cards
    }
}

MemorizeApp.swift

//
//  MemorizeApp.swift
//  Memorize
//
//  Created by Austin on 5/25/21.
//  austinsnerdythings.com

import SwiftUI

@main
struct MemorizeApp: App {
    var body: some Scene {
        WindowGroup {
            ContentView()
        }
    }
}

ContentView.swift is basically the same as last lecture. It wasn’t touched much (if at all) so I won’t include it. CS193p Spring 2021 lecture 3 was mostly about some Swift ideas and MVVM, not the view for the app.

Conclusion

I am super glad I started watching these lectures so they could get me going in the right direction for Swift. Sure I’m a bit frustrated now because I realize where I went wrong, but I’m excited to get back to lecture 4 and the assignment after this long weekend.

Categories
Programming

Learning Swift – Confusion on Conciseness

Is Swift too concise for beginners?

This is not the first time I've tried to learn Swift. The first go took place maybe October/November 2020. I followed the official Apple Landmarks tutorial (called Creating and Combining Views) and things just did not click. I looked elsewhere for tutorials as well. My wife and I also had our first child, then about 3 months old, at home, so my brain wasn't functioning at 100%. Regardless, I could follow the Landmarks tutorial, but not really step out on my own. The words I used to describe Swift to myself were "too cute" and "nuanced" and other things like that. In the Stanford CS193p Spring 2021 Lecture 3 video, there was a 2-minute section that really cleared things up for me. My background is mostly C# with some Python, so that's where I'm coming from.

Shortening things up

At 1:33:43 in the lecture, Professor Hegarty finishes taking two completely reasonable functions and chopping out more than half the characters. The resulting combination functions exactly the same as the two larger versions.

Before:

func makeCardContent(index: Int) -> String {
    return "A"
}
private var model: MemoryGame<String> =
    MemoryGame<String>(numberOfPairsOfCards: 4, createCardContent: makeCardContent)

Middle:

private var model: MemoryGame<String> =
    MemoryGame<String>(numberOfPairsOfCards: 4, createCardContent: {(index: Int) -> String in
        return "A"
    })

After:

private var model: MemoryGame<String> =
    MemoryGame<String>(numberOfPairsOfCards: 4) { _ in "A" }

The theme of the code reduction is 'taking out things that Swift already knows', along with 'if this is the last argument of a function, plop the function in its place' (trailing closure syntax). Let's examine that for a minute.

The last argument of a function thing is really an if-then that you need to perform mentally while writing code. When learning a language, it isn't particularly easy to figure out what's going on when all the code snippets are already fully reduced. Maybe I missed a key page in the documentation, but this wasn't made clear to me in any of the learning I attempted to do. It could also just be that I don't understand how functional programming is supposed to work.

That same code in C# (at least for the versions I use) would be a lot clearer to read. Everything would be specified, unless you precede a variable with var, which indicates that you want the compiler to infer the type. Being able to decide is nice.

When does concise become confusing?

All that said, I still think "cute" and "nuanced" are appropriate for describing Swift (at least SwiftUI). It tries to be cute by cutting out things where other languages just leave them in. The underscore (_) when you don't need to specify an argument is another example of this. Why not just make every argument label optional unless specifically called out as necessary?

The other thing is the mix between Swift being a strongly typed language and its heavy use of type inference. If it is strongly typed, we should need to specify the type basically everywhere. Leaving out the types and letting the compiler infer them seems to work really well (I know the compilers are all much smarter than me), but it doesn't help readability.

Conclusion

Are these valid criticisms? I don’t know. If a Swift expert wants to watch me (via screensharing) work on some basic cryptocurrency tracking app I have going for 30-60 minutes to answer my questions and help me learn Swift (I would pay $$$!), I would love that. Swift will make more sense the more I write it, I know that, but I’m left wondering if I’ll always have these thoughts. Beautiful Swift is indeed beautiful. I just need to figure out how to get there.