<?xml version="1.0" encoding="UTF-8"?><rss version="2.0" xmlns:content="http://purl.org/rss/1.0/modules/content/">
  <channel>
    <title>onyx, a tech blog</title>
    <link>https://blog.onyxandiris.online/</link>
    <description>thoughts related to programming and linux</description>
    <pubDate>Mon, 16 Mar 2026 12:00:58 +0000</pubDate>
    <item>
      <title>The VBAN TEXT/SERVICE Subprotocols</title>
      <link>https://blog.onyxandiris.online/the-vban-text-service-subprotocols</link>
      <description>&lt;![CDATA[If you&#39;re familiar with Voicemeeter then you&#39;ve probably heard of VBAN. It&#39;s a protocol proposed by VB-Audio for transmitting data (audio/video/text) over a network. With it you can do all kinds of fantastic things. &#xA;&#xA;!--more--&#xA;&#xA;In order to fully utilise remote controlling over VBAN you need two way communication which requires implementing both TEXT (outgoing) and SERVICE (incoming) subprotocols. &#xA;&#xA;---&#xA;&#xA;TEXT&#xA;&#xA;Text is fairly straightforward in that you are required to build a packet comprised of a header matching the specification along with a payload and the VBAN server should process it. &#xA;&#xA;A barebones example:&#xA;&#xA;import socket&#xA;import struct&#xA;&#xA;fmt: off&#xA;BPSOPTS: list[int] = [&#xA;    0, 110, 150, 300, 600, 1200, 2400, 4800, 9600, 14400, 19200, 31250,&#xA;    38400, 57600, 115200, 128000, 230400, 250000, 256000, 460800, 921600,&#xA;    1000000, 1500000, 2000000, 3000000&#xA;]&#xA;fmt: on&#xA;SUBPROTOCOLTXT = 0x40&#xA;CHANNEL = 0&#xA;STREAMTYPEUTF8 = 0x10&#xA;&#xA;def main(&#xA;    command: str,&#xA;    host: str = &#34;localhost&#34;,&#xA;    port: int = 6980,&#xA;    streamname: str = &#34;Command1&#34;,&#xA;) -  None:&#xA;    with socket.socket(socket.AFINET, socket.SOCKDGRAM) as sock:&#xA;        header = struct.pack(&#xA;            &#34;&lt;4s4B16sI&#34;,&#xA;            b&#34;VBAN&#34;,&#xA;            BPSOPTS.index(256000) | SUBPROTOCOLTXT,&#xA;            0,&#xA;            CHANNEL,&#xA;            STREAMTYPEUTF8,&#xA;            streamname.encode(&#34;utf-8&#34;).ljust(16, b&#34;\0&#34;),&#xA;            0,&#xA;        )&#xA;&#xA;        sock.sendto(header + command.encode(&#34;utf-8&#34;), (host, port))&#xA;&#xA;---&#xA;&#xA;SERVICE&#xA;&#xA;Service is more involved in that you are required to:&#xA;&#xA;Subscribe to the service to receive the data&#xA;Parse the incoming data packets.&#xA;&#xA;For the first step we can fire a subscription packet 
matching the protocol specification but we must do this on an interval less than the time we subscribe for.&#xA;&#xA;import socket&#xA;import struct&#xA;import threading&#xA;import time&#xA;&#xA;SUBPROTOCOLSERVICE = 0x60&#xA;RTPACKETREGISTER = 32&#xA;SUBSCRIPTIONTIMEOUT = 5&#xA;PACKETIDENT = 0&#xA;&#xA;def subscribetoservice(&#xA;    sock: socket.socket, host: str, port: int, stopevent: threading.Event&#xA;):&#xA;    framecounter = 0&#xA;    while not stopevent.isset():&#xA;        header = struct.pack(&#xA;            &#34;&lt;4s4B16sI&#34;,&#xA;            b&#34;VBAN&#34;,&#xA;            SUBPROTOCOLSERVICE,&#xA;            PACKETIDENT &amp; 0xFF,&#xA;            RTPACKETREGISTER,&#xA;            SUBSCRIPTIONTIMEOUT &amp; 0xFF,&#xA;            b&#34;Register-RTP&#34;.ljust(16, b&#34;\0&#34;),&#xA;            framecounter,&#xA;        )&#xA;        framecounter += 1&#xA;&#xA;        sock.sendto(header, (host, port))&#xA;&#xA;        time.sleep(SUBSCRIPTIONTIMEOUT - 1)&#xA;&#xA;def main(&#xA;    host: str = &#34;localhost&#34;,&#xA;    port: int = 6980,&#xA;):&#xA;    stopevent = threading.Event()&#xA;    with socket.socket(socket.AFINET, socket.SOCKDGRAM) as sock:&#xA;        t = threading.Thread(&#xA;            target=subscribetoservice, args=(sock, host, port, stopevent)&#xA;        )&#xA;        t.start()&#xA;&#xA;        while not stopevent.isset():&#xA;            try:&#xA;                data, addr = sock.recvfrom(2048)&#xA;                print(f&#34;Received data from {addr}: {data}&#34;)&#xA;            except socket.timeout:&#xA;                continue&#xA;            except KeyboardInterrupt:&#xA;                stopevent.set()&#xA;&#xA;        t.join()&#xA;&#xA;What we&#39;ll receive in the output is a large dump of data:&#xA;&#xA;Received data from (&#39;localhost&#39;, 6980): 
b&#39;VBAN`\x00!\x00Voicemeeter-RTP\x00\x1f\x9e\t\x00\x03\x00\x00\x04\x02\x02\x01\x03\x00\x00\x00\x00\x80\xbb\x00\x00\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\x83\xfa\x83\xfa\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\x83\xfa\x83\xfa\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\x83\xfa\x83\xfa\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1...&#xA;&#xA;This isn&#39;t useful to us unless we parse and convert it to python types. Here is a simple class that parses bytes 28-43 of an incoming RT Packet:&#xA;&#xA;@dataclass&#xA;class VbanRTPacket:&#xA;    &#34;&#34;&#34;Represents bytes 28-43 of an incoming RTPacket&#34;&#34;&#34;&#xA;&#xA;    HEADERSIZE = 4 + 1 + 1 + 1 + 1 + 16&#xA;&#xA;    voicemeeterType: bytes&#xA;    reserved: bytes&#xA;    buffersize: bytes&#xA;    voicemeeterVersion: bytes&#xA;    optionBits: bytes&#xA;    samplerate: bytes&#xA;&#xA;    def str(self) -  str:&#xA;        return &#34;, &#34;.join(&#xA;            [&#xA;                f&#34;{self.voicemeetertype=}&#34;,&#xA;                f&#34;{self.voicemeeterversion=}&#34;,&#xA;                f&#34;{self.samplerate=}&#34;,&#xA;            ]&#xA;        )&#xA;&#xA;    @property&#xA;    def voicemeetertype(self) -  str:&#xA;        &#34;&#34;&#34;returns voicemeeter type as a string&#34;&#34;&#34;&#xA;        return [&#34;&#34;, &#34;basic&#34;, &#34;banana&#34;, &#34;potato&#34;][&#xA;            int.frombytes(self.voicemeeterType, &#34;little&#34;)&#xA;        ]&#xA;&#xA;    @property&#xA;    def voicemeeterversion(self) -  tuple:&#xA;        &#34;&#34;&#34;returns voicemeeter version as a tuple&#34;&#34;&#34;&#xA;        return tuple(self.voicemeeterVersion[i] for i in range(3, 
-1, -1))&#xA;&#xA;    @property&#xA;    def samplerate(self) -  int:&#xA;        &#34;&#34;&#34;returns samplerate as an int&#34;&#34;&#34;&#xA;        return int.frombytes(self.samplerate, &#34;little&#34;)&#xA;&#xA;    @classmethod&#xA;    def frombytes(cls, data: bytes) -  &#34;VbanRTPacket&#34;:&#xA;        &#34;&#34;&#34;Returns a dataclass representing the RTPacket data &#xA;        from bytes 28-43 of the incoming packet&#xA;        &#34;&#34;&#34;&#xA;        return cls(&#xA;            voicemeeterType=data[28:29],&#xA;            reserved=data[29:30],&#xA;            buffersize=data[30:32],&#xA;            voicemeeterVersion=data[32:36],&#xA;            optionBits=data[36:40],&#xA;            samplerate=data[40:44],&#xA;        )&#xA;&#xA;However, a VBAN server can throw a lot of different kinds of data to a listening socket so it&#39;s important to filter out the data you need. This can be done by adding in some guard clauses:&#xA;&#xA;def main(&#xA;    host: str = &#34;localhost&#34;,&#xA;    port: int = 6980,&#xA;):&#xA;    stopevent = threading.Event()&#xA;    with socket.socket(socket.AFINET, socket.SOCKDGRAM) as sock:&#xA;        t = threading.Thread(&#xA;            target=subscribetoservice, args=(sock, host, port, stopevent)&#xA;        )&#xA;        t.start()&#xA;&#xA;        while not stopevent.isset():&#xA;            try:&#xA;                data, addr = sock.recvfrom(2048)&#xA;                if len(data) &lt; VbanRTPacket.HEADERSIZE:&#xA;                    continue&#xA;&#xA;                if data[0:4] != b&#34;VBAN&#34;:&#xA;                    continue&#xA;&#xA;                protocol = data[4] &amp; 0xE0&#xA;                if protocol != SUBPROTOCOLSERVICE:&#xA;                    continue&#xA;&#xA;                if data[6] != RTPACKET:&#xA;                    continue&#xA;&#xA;                packet = VbanRTPacket.frombytes(data)&#xA;                print(packet)&#xA;&#xA;            except socket.timeout:&#xA;                
continue&#xA;            except KeyboardInterrupt:&#xA;                stopevent.set()&#xA;&#xA;        t.join()&#xA;&#xA;The final output of the script is now:&#xA;&#xA;self.voicemeetertype=&#39;potato&#39;, self.voicemeeterversion=(3, 1, 2, 2), self.samplerate=48000&#xA;&#xA;Demonstrating how we can subscribe for real time data from the RTPacket service and convert the returned data into usable python types.&#xA;&#xA;---&#xA;&#xA;Conclusion&#xA;&#xA;There&#39;s a lot more to the specification than that which has been demonstrated in this blog post. You can find a more complete implementation of the TEXT/SERVICE subprotocols in the vban-cmd python package along with a python interface offering an abstraction layer over the dataclasses making scripts like the following possible:&#xA;&#xA;class ManyThings:&#xA;    def init(self, vban):&#xA;        self.vban = vban&#xA;&#xA;    def things(self):&#xA;        self.vban.strip[0].label = &#39;podmic&#39;&#xA;        self.vban.strip[0].mute = True&#xA;&#xA;    def otherthings(self):&#xA;        self.vban.bus[3].gain = -6.3&#xA;        self.vban.bus[4].eq = True&#xA;        info = (&#xA;            f&#39;bus 3 gain has been set to {self.vban.bus[3].gain}&#39;,&#xA;            f&#39;bus 4 eq has been set to {self.vban.bus[4].eq}&#39;,&#xA;        )&#xA;        print(&#39;\n&#39;.join(info))&#xA;&#xA;def main():&#xA;    with vbancmd.api(&#39;banana&#39;, ip=&#39;localhost&#39;, port=6980) as vban:&#xA;        do = ManyThings(vban)&#xA;        do.things()&#xA;        do.otherthings()&#xA;&#xA;        # set many parameters at once&#xA;        vban.apply(&#xA;            {&#xA;                &#39;strip-2&#39;: {&#39;A1&#39;: True, &#39;B1&#39;: True, &#39;gain&#39;: -6.0},&#xA;                &#39;bus-2&#39;: {&#39;mute&#39;: True},&#xA;                &#39;vban-in-0&#39;: {&#39;on&#39;: True},&#xA;            }&#xA;        )&#xA;&#xA;Or perhaps even include it in another package altogether, for example vban-cli, working 
entirely over VBAN. The possibilities are endless.&#xA;&#xA;---&#xA;&#xA;Other fantastic projects implementing various VBAN subprotocols: &#xA;&#xA;vban A pure C implementation of AUDIO/TEXT.&#xA;pyVBAN A python implementation of AUDIO/SERIAL and TEXT.&#xA;obs-vban An OBS plugin implementing AUDIO.&#xA;vbantxt A Go implementation of TEXT offering a single binary&#xA;&#xA;Even more with a quick search.&#xA;&#xA;Subscribe to this blog&#39;s RSS feed]]&gt;</description>
      <content:encoded><![CDATA[<p>If you&#39;re familiar with Voicemeeter then you&#39;ve probably heard of <a href="https://vb-audio.com/Voicemeeter/vban.htm">VBAN</a>. It&#39;s a protocol proposed by <a href="https://vb-audio.com/Cable/">VB-Audio</a> for transmitting data (audio/video/text) over a network. With it you can do all kinds of fantastic things.</p>



<p>In order to fully utilise remote control over VBAN you need two-way communication, which requires implementing both the TEXT (outgoing) and SERVICE (incoming) subprotocols.</p>

<hr>

<h4 id="text">TEXT</h4>

<p>TEXT is fairly straightforward: you build a packet comprising a header that matches the specification along with a payload, and the VBAN server should process it.</p>

<p>A barebones example:</p>

<pre><code class="language-python">import socket
import struct

# fmt: off
BPS_OPTS: list[int] = [
    0, 110, 150, 300, 600, 1200, 2400, 4800, 9600, 14400, 19200, 31250,
    38400, 57600, 115200, 128000, 230400, 250000, 256000, 460800, 921600,
    1000000, 1500000, 2000000, 3000000
]
# fmt: on
SUBPROTOCOL_TXT = 0x40
CHANNEL = 0
STREAMTYPE_UTF8 = 0x10


def main(
    command: str,
    host: str = &#34;localhost&#34;,
    port: int = 6980,
    streamname: str = &#34;Command1&#34;,
) -&gt; None:
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        header = struct.pack(
            &#34;&lt;4s4B16sI&#34;,
            b&#34;VBAN&#34;,
            BPS_OPTS.index(256000) | SUBPROTOCOL_TXT,
            0,
            CHANNEL,
            STREAMTYPE_UTF8,
            streamname.encode(&#34;utf-8&#34;).ljust(16, b&#34;\0&#34;),
            0,
        )

        sock.sendto(header + command.encode(&#34;utf-8&#34;), (host, port))
</code></pre>
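
<p>To demystify the first header byte: it packs the bit-rate index into the low bits and the subprotocol into the high bits. A standalone sketch of that arithmetic, using the constants from the example above:</p>

<pre><code class="language-python">BPS_OPTS = [
    0, 110, 150, 300, 600, 1200, 2400, 4800, 9600, 14400, 19200, 31250,
    38400, 57600, 115200, 128000, 230400, 250000, 256000, 460800, 921600,
    1000000, 1500000, 2000000, 3000000
]
SUBPROTOCOL_TXT = 0x40

# 256000 sits at index 18, so the packed byte is 18 | 0x40
sr_byte = BPS_OPTS.index(256000) | SUBPROTOCOL_TXT
print(hex(sr_byte))  # 0x52
</code></pre>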

<hr>

<h4 id="service">SERVICE</h4>

<p>The SERVICE subprotocol is more involved, in that you are required to:</p>
<ul><li>Subscribe to the service to receive the data.</li>
<li>Parse the incoming data packets.</li></ul>

<p>For the first step we fire a subscription packet matching the protocol specification, but we must re-send it on an interval shorter than the period we subscribe for.</p>

<pre><code class="language-python">import socket
import struct
import threading
import time

SUBPROTOCOL_SERVICE = 0x60
RTPACKETREGISTER = 32
SUBSCRIPTION_TIMEOUT = 5
PACKET_IDENT = 0


def subscribe_to_service(
    sock: socket.socket, host: str, port: int, stop_event: threading.Event
):
    framecounter = 0
    while not stop_event.is_set():
        header = struct.pack(
            &#34;&lt;4s4B16sI&#34;,
            b&#34;VBAN&#34;,
            SUBPROTOCOL_SERVICE,
            PACKET_IDENT &amp; 0xFF,
            RTPACKETREGISTER,
            SUBSCRIPTION_TIMEOUT &amp; 0xFF,
            b&#34;Register-RTP&#34;.ljust(16, b&#34;\0&#34;),
            framecounter,
        )
        framecounter += 1

        sock.sendto(header, (host, port))

        time.sleep(SUBSCRIPTION_TIMEOUT - 1)


def main(
    host: str = &#34;localhost&#34;,
    port: int = 6980,
):
    stop_event = threading.Event()
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.settimeout(2)  # without a timeout recvfrom blocks forever and socket.timeout is never raised
        t = threading.Thread(
            target=subscribe_to_service, args=(sock, host, port, stop_event)
        )
        t.start()

        while not stop_event.is_set():
            try:
                data, addr = sock.recvfrom(2048)
                print(f&#34;Received data from {addr}: {data}&#34;)
            except socket.timeout:
                continue
            except KeyboardInterrupt:
                stop_event.set()

        t.join()
</code></pre>

<p>What we&#39;ll receive in the output is a large dump of data:</p>

<pre><code class="language-console">Received data from (&#39;localhost&#39;, 6980): b&#39;VBAN`\x00!\x00Voicemeeter-RTP\x00\x1f\x9e\t\x00\x03\x00\x00\x04\x02\x02\x01\x03\x00\x00\x00\x00\x80\xbb\x00\x00\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\x83\xfa\x83\xfa\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\x83\xfa\x83\xfa\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\x83\xfa\x83\xfa\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1\xe0\xb1...
</code></pre>
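
<p>It helps to line the start of that dump up against the header layout we packed earlier. A sketch over just the first 24 bytes shown above:</p>

<pre><code class="language-python"># First 24 bytes of the dump: magic, SR/sub-protocol byte, three
# service bytes, then the 16-byte stream name.
data = b"VBAN\x60\x00\x21\x00Voicemeeter-RTP\x00"

print(data[0:4])                   # b'VBAN', the magic
print(hex(data[4]))                # 0x60, the SERVICE subprotocol (the backtick in the dump)
print(data[6])                     # 33 (0x21, the '!'): the service id of an RT data packet
print(data[8:24].rstrip(b"\x00"))  # b'Voicemeeter-RTP', the stream name
</code></pre>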

<p>This isn&#39;t useful to us unless we parse it and convert it to python types. Here is a simple class that parses bytes 28-43 of an incoming RT packet:</p>

<pre><code class="language-python">from dataclasses import dataclass


@dataclass
class VbanRTPacket:
    &#34;&#34;&#34;Represents bytes 28-43 of an incoming RTPacket&#34;&#34;&#34;

    HEADER_SIZE = 4 + 1 + 1 + 1 + 1 + 16 + 4  # the full 28-byte VBAN header, frame counter included

    _voicemeeterType: bytes
    _reserved: bytes
    _buffersize: bytes
    _voicemeeterVersion: bytes
    _optionBits: bytes
    _samplerate: bytes

    def __str__(self) -&gt; str:
        return &#34;, &#34;.join(
            [
                f&#34;{self.voicemeetertype=}&#34;,
                f&#34;{self.voicemeeterversion=}&#34;,
                f&#34;{self.samplerate=}&#34;,
            ]
        )

    @property
    def voicemeetertype(self) -&gt; str:
        &#34;&#34;&#34;returns voicemeeter type as a string&#34;&#34;&#34;
        return [&#34;&#34;, &#34;basic&#34;, &#34;banana&#34;, &#34;potato&#34;][
            int.from_bytes(self._voicemeeterType, &#34;little&#34;)
        ]

    @property
    def voicemeeterversion(self) -&gt; tuple:
        &#34;&#34;&#34;returns voicemeeter version as a tuple&#34;&#34;&#34;
        return tuple(self._voicemeeterVersion[i] for i in range(3, -1, -1))

    @property
    def samplerate(self) -&gt; int:
        &#34;&#34;&#34;returns samplerate as an int&#34;&#34;&#34;
        return int.from_bytes(self._samplerate, &#34;little&#34;)

    @classmethod
    def from_bytes(cls, data: bytes) -&gt; &#34;VbanRTPacket&#34;:
        &#34;&#34;&#34;Returns a dataclass representing the RTPacket data 
        from bytes 28-43 of the incoming packet
        &#34;&#34;&#34;
        return cls(
            _voicemeeterType=data[28:29],
            _reserved=data[29:30],
            _buffersize=data[30:32],
            _voicemeeterVersion=data[32:36],
            _optionBits=data[36:40],
            _samplerate=data[40:44],
        )
</code></pre>
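
<p>As a quick sanity check, here is how those byte slices decode by hand. The packet below is hypothetical: the header portion is zeroed padding and the info-section values are chosen to match the final output further down.</p>

<pre><code class="language-python">data = bytes(28)                       # zeroed stand-in for the 28-byte header
data += bytes([3])                     # voicemeeterType: 3 means "potato"
data += bytes([0])                     # reserved
data += (1024).to_bytes(2, "little")   # buffersize (arbitrary here)
data += bytes([2, 2, 1, 3])            # version bytes, read back-to-front
data += (0).to_bytes(4, "little")      # option bits
data += (48000).to_bytes(4, "little")  # samplerate

print(["", "basic", "banana", "potato"][data[28]])      # potato
print(tuple(data[32:36][i] for i in range(3, -1, -1)))  # (3, 1, 2, 2)
print(int.from_bytes(data[40:44], "little"))            # 48000
</code></pre>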

<p>However, a VBAN server can send many different kinds of data to a listening socket, so it&#39;s important to filter for the data you need. This can be done by adding some guard clauses:</p>

<pre><code class="language-python">RTPACKET = 33  # service id carried by the incoming RT data packets


def main(
    host: str = &#34;localhost&#34;,
    port: int = 6980,
):
    stop_event = threading.Event()
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.settimeout(2)  # without a timeout recvfrom blocks forever and socket.timeout is never raised
        t = threading.Thread(
            target=subscribe_to_service, args=(sock, host, port, stop_event)
        )
        t.start()

        while not stop_event.is_set():
            try:
                data, addr = sock.recvfrom(2048)
                if len(data) &lt; VbanRTPacket.HEADER_SIZE:
                    continue

                if data[0:4] != b&#34;VBAN&#34;:
                    continue

                protocol = data[4] &amp; 0xE0
                if protocol != SUBPROTOCOL_SERVICE:
                    continue

                if data[6] != RTPACKET:
                    continue

                packet = VbanRTPacket.from_bytes(data)
                print(packet)

            except socket.timeout:
                continue
            except KeyboardInterrupt:
                stop_event.set()

        t.join()
</code></pre>

<p>The final output of the script is now:</p>

<pre><code class="language-console">self.voicemeetertype=&#39;potato&#39;, self.voicemeeterversion=(3, 1, 2, 2), self.samplerate=48000
</code></pre>

<p>This demonstrates how we can subscribe to real-time data from the RTPacket service and convert the returned bytes into usable python types.</p>

<hr>

<h4 id="conclusion">Conclusion</h4>

<p>There&#39;s a lot more to the specification than has been demonstrated in this blog post. You can find a more complete implementation of the TEXT/SERVICE subprotocols in the <a href="https://git.onyxandiris.online/onyx_online/vban-cmd-python">vban-cmd</a> python package, along with a python interface offering an abstraction layer over the dataclasses, making scripts like the following possible:</p>

<pre><code class="language-python">class ManyThings:
    def __init__(self, vban):
        self.vban = vban

    def things(self):
        self.vban.strip[0].label = &#39;podmic&#39;
        self.vban.strip[0].mute = True

    def other_things(self):
        self.vban.bus[3].gain = -6.3
        self.vban.bus[4].eq = True
        info = (
            f&#39;bus 3 gain has been set to {self.vban.bus[3].gain}&#39;,
            f&#39;bus 4 eq has been set to {self.vban.bus[4].eq}&#39;,
        )
        print(&#39;\n&#39;.join(info))


def main():
    with vban_cmd.api(&#39;banana&#39;, ip=&#39;localhost&#39;, port=6980) as vban:
        do = ManyThings(vban)
        do.things()
        do.other_things()

        # set many parameters at once
        vban.apply(
            {
                &#39;strip-2&#39;: {&#39;A1&#39;: True, &#39;B1&#39;: True, &#39;gain&#39;: -6.0},
                &#39;bus-2&#39;: {&#39;mute&#39;: True},
                &#39;vban-in-0&#39;: {&#39;on&#39;: True},
            }
        )
</code></pre>

<p>Or perhaps even include it in another package altogether, for example <a href="https://git.onyxandiris.online/onyx_online/vban-cli">vban-cli</a>, working entirely over VBAN. The possibilities are endless.</p>

<hr>

<p>Other fantastic projects implementing various VBAN subprotocols:</p>
<ul><li><a href="https://github.com/quiniouben/vban">vban</a> A pure C implementation of AUDIO/TEXT.</li>
<li><a href="https://github.com/TheStaticTurtle/pyVBAN">pyVBAN</a> A python implementation of AUDIO/SERIAL and TEXT.</li>
<li><a href="https://github.com/norihiro/obs-vban">obs-vban</a> An OBS plugin implementing AUDIO.</li>
<li><a href="https://github.com/onyx-and-iris/vbantxt">vbantxt</a> A Go implementation of TEXT offering a single binary.</li></ul>

<p><a href="https://github.com/search?q=vban&amp;type=repositories">Even more</a> with a quick search.</p>

<p>Subscribe to this blog&#39;s <a href="https://blog.onyxandiris.online/feed/">RSS feed</a></p>
]]></content:encoded>
      <guid>https://blog.onyxandiris.online/the-vban-text-service-subprotocols</guid>
      <pubDate>Mon, 16 Mar 2026 04:08:53 +0000</pubDate>
    </item>
    <item>
      <title>AmpliGame D6 and Streamlabs</title>
      <link>https://blog.onyxandiris.online/ampligame-d6-and-streamlabs</link>
      <description>&lt;![CDATA[I recently stumbled across a number of Reddit posts/Youtube comments asking how to switch scenes on Streamlabs using the Fifine AmpliGame D6 controller, so I decided to investigate...&#xA;&#xA;!--more--&#xA;&#xA;Ampligame D6&#xA;&#xA;Background&#xA;&#xA;It turns out the AmpliGame D6 does include a Streamlabs plugin:&#xA;&#xA;D6StreamlabsPlugin&#xA;&#xA;But you have to jump through hoops to get it to connect and even if you get it working (I had problems authorising), it&#39;s not obvious how to switch scenes with it, if it&#39;s even possible.&#xA;&#xA;---&#xA;&#xA;Solution&#xA;&#xA;So how else can we control Streamlabs from the AmpliGame D6? One option is via a CLI leveraging websockets but we need a way to run it. The D6 does include a feature Toolbox   Open which allows a user to load an application with arguments. This sounds great but in my testing, it opens a CMD window which threw me out of game, so no good. &#xA;&#xA;So how else can we run a CLI?&#xA;&#xA;---&#xA;&#xA;ScriptDeck&#xA;&#xA;Enter ScriptDeck, a wonderful plugin by the StartAutomating guys. It was created for the Elgato StreamDeck but can be installed on the AmpliGame D6 by following these instructions by TheBeardOfKnowledge. &#xA;&#xA;To summarise:&#xA;&#xA;Install Elgato StreamDeck software, register for a free account on their site&#xA;Download any plugins/icon packs you want&#xA;Copy and paste them into the D6 plugins directory.&#xA;Restart the D6 client.&#xA;&#xA;Note, at the time of writing this still works for the ScriptDeck plugin although it no longer works for many other plugins.&#xA;&#xA;When downloading the ScriptDeck plugin you&#39;ll notice there are two versions, one named WindowsScriptDeck and the other ScriptDeck. The former works with Powershell 5 and the latter with Powershell 7, so make sure you install the correct one. 
Windows by default has Powershell 5 installed but Powershell 7 is easy enough to install.&#xA;&#xA;Now that we have a way to run our CLI, we need a way to install the CLI and make it discoverable.&#xA;&#xA;---&#xA;&#xA;SLOBS-CLI&#xA;&#xA;slobs-cli is a python CLI, so you&#39;ll need python 3.11 or greater at the very least. Thankfully the guys at Astral have created uv a python package manager which makes installing python CLIs a cinch. There are several ways to install uv on Windows so select any one of them from their installation instructions. &#xA;&#xA;Once that&#39;s done, installing the slobs-cli is a one-liner:&#xA;&#xA;uv tool install slobs-cli&#xA;&#xA;In order to communicate with Streamlabs the slobs-cli expects to know the websocket connection information. In recent versions of Streamlabs you&#39;ll find the information in Settings   Mobile   Third Party Connections:&#xA;&#xA;Streamlabs Remote Conn&#xA;&#xA;Once you have the ip, port and websocket token it&#39;s time to test. First open a Powershell session in Windows and run:&#xA;&#xA;slobs-cli --domain localhost --port 59650 --token API token scene list&#xA;&#xA;You should be met with a response like:&#xA;&#xA;slobs-cli scene list&#xA;&#xA;This is great, it&#39;s working! However, passing the connection info on every invocation is cumbersome. A better way is to use environment variables. The way to manage this in Powershell is to store them in a Powershell profile, so in your Powershell session enter $profile and open the file (or create if it doesn&#39;t exist). 
Then in your Microsoft.PowerShellprofile.ps1 file enter the following:&#xA;&#xA;slobs-cli&#xA;$Env:SLOBSDOMAIN = &#34;localhost&#34;&#xA;$Env:SLOBSPORT = &#34;59650&#34;&#xA;$Env:SLOBSTOKEN = API Token&#xA;&#xA;To be sure this works, restart your Powershell session then retry the command above but without passing the connection flags:&#xA;&#xA;slobs-cli scene list&#xA;&#xA;---&#xA;&#xA;Running slobs-cli on the D6&#xA;&#xA;Now we have a way to run our CLI and we have the CLI installed, we just need to run it from the D6 controller!&#xA;&#xA;Expand the ScriptDeck plugin in the Fifine Control Deck software and drag-n-drop Powershell Script onto a button. Then in When Presssed just enter your slobs-cli command like so:&#xA;&#xA;D6Software-Slobs-CLI&#xA;&#xA;Fingers crossed, press the button on your D6 and watch the scene switch. Voilà!&#xA;&#xA;You can do a lot more than just scene switching with slobs-cli so check the README for a full list of available commands.&#xA;&#xA;---&#xA;&#xA;Conclusion&#xA;&#xA;If you&#39;re unfamiliar with using python or CLIs in general then this process might seem daunting but if you follow the steps carefully it shouldn&#39;t take long to set up.&#xA;&#xA;Furthermore, you can now control other software using CLIs, for example, OBS, Voicemeeter, Meld Studio etc.&#xA;&#xA;Further Notes:&#xA;The same process can be carried out for Meld Studio swapping out the slobs-cli part for meld-cli. However, it does require node instead of python and at the time of writing its still possible to port the Meld Studio plugin as detailed in the above section ScriptDeck.&#xA;&#xA;Subscribe to this blog&#39;s RSS feed]]&gt;</description>
      <content:encoded><![CDATA[<p>I recently stumbled across a number of Reddit posts/Youtube comments asking how to switch scenes on Streamlabs using the <a href="https://fifinemicrophone.com/products/fifine-ampligame-d6?srsltid=AfmBOopuYpykrlX_ZEO2qBDouWRgzO0NurMhACXSbKl6B6PNmGrnhpRY">Fifine AmpliGame D6</a> controller, so I decided to investigate...</p>



<p><img src="https://img.onyxandiris.online/api/photo/ampligamed6_Bm4vRDWK.jpg?token=1fVosuca" alt="Ampligame D6"></p>

<h4 id="background">Background</h4>

<p>It turns out the AmpliGame D6 does include a Streamlabs plugin:</p>

<p><img src="https://img.onyxandiris.online/api/photo/d6streamlabs_iY4lKT7U.png?token=uNl4W5vN" alt="D6StreamlabsPlugin"></p>

<p>But you have to <a href="https://youtu.be/z63OCroSRcY?si=NjCdJcvo_NtQWYwY&amp;t=19">jump through hoops</a> to get it to connect and even if you get it working (I had problems authorising), it&#39;s not obvious how to switch scenes with it, if it&#39;s even possible.</p>

<hr>

<h4 id="solution">Solution</h4>

<p>So how else can we control Streamlabs from the AmpliGame D6? One option is via a CLI leveraging websockets, but we need a way to run it. The D6 does include a Toolbox &gt; Open feature which allows a user to launch an application with arguments. This sounds great, but in my testing it opened a CMD window which threw me out of the game, so no good.</p>

<p>So how else can we run a CLI?</p>

<hr>

<h4 id="scriptdeck">ScriptDeck</h4>

<p>Enter <a href="https://github.com/StartAutomating/ScriptDeck">ScriptDeck</a>, a wonderful plugin by the <a href="https://startautomating.com">StartAutomating</a> guys. It was created for the Elgato StreamDeck but can be installed on the AmpliGame D6 by following <a href="https://github.com/TheBeardofKnowledge/StreamDeck">these instructions</a> by TheBeardOfKnowledge.</p>

<p>To summarise:</p>
<ul><li>Install Elgato StreamDeck software, register for a free account on their site</li>
<li>Download any plugins/icon packs you want</li>
<li>Copy and paste them into the D6 plugins directory.</li>
<li>Restart the D6 client.</li></ul>

<p><em>Note, at the time of writing this still works for the ScriptDeck plugin although it no longer works for many other plugins.</em></p>

<p>When downloading the ScriptDeck plugin you&#39;ll notice there are two versions, one named <a href="https://marketplace.elgato.com/product/windows-scriptdeck-857f01dd-8fd4-44d5-8ec7-67ac850b21d3">WindowsScriptDeck</a> and the other <a href="https://marketplace.elgato.com/product/scriptdeck-927e59aa-b42d-4da7-84cc-8c78f4dd7e18?utm_source=pdp_related_v2">ScriptDeck</a>. The former works with Powershell 5 and the latter with Powershell 7, so make sure you install the correct one. Windows by default has Powershell 5 installed but Powershell 7 is <a href="https://learn.microsoft.com/en-us/shows/it-ops-talk/how-to-install-powershell-7">easy enough to install</a>.</p>

<p>Now that we have a way to run our CLI, we need a way to install the CLI and make it discoverable.</p>

<hr>

<h4 id="slobs-cli">SLOBS-CLI</h4>

<p><a href="https://github.com/onyx-and-iris/slobs-cli">slobs-cli</a> is a python CLI, so you&#39;ll need python 3.11 or greater. Thankfully the guys at <a href="https://astral.sh">Astral</a> have created <a href="https://docs.astral.sh/uv/">uv</a>, a python package manager which makes installing python CLIs a cinch. There are several ways to install uv on Windows, so pick any one of them from their <a href="https://docs.astral.sh/uv/getting-started/installation/#__tabbed_1_2">installation instructions</a>.</p>

<p>Once that&#39;s done, installing slobs-cli is a one-liner:</p>

<p><code>uv tool install slobs-cli</code></p>

<p>In order to communicate with Streamlabs, slobs-cli expects to know the websocket connection information. In recent versions of Streamlabs you&#39;ll find it under Settings &gt; Mobile &gt; Third Party Connections:</p>

<p><img src="https://img.onyxandiris.online/api/photo/streamlabs-remoteconn_tjuMkqib.png?token=QhPMR18u" alt="Streamlabs Remote Conn"></p>

<p>Once you have the ip, port and websocket token it&#39;s time to test. First open a Powershell session in Windows and run:</p>

<p><code>slobs-cli --domain localhost --port 59650 --token &lt;API token&gt; scene list</code></p>

<p>You should be met with a response like:</p>

<p><img src="https://img.onyxandiris.online/api/photo/slobs-cli-scenelist_OVPn3Vhk.png?token=xVrYGUdC" alt="slobs-cli scene list"></p>

<p>This is great, it&#39;s working! However, passing the connection info on every invocation is cumbersome. A better way is to use environment variables. In Powershell the way to manage these is with a Powershell profile: in your Powershell session enter <code>$profile</code> to see its path, then open the file (or create it if it doesn&#39;t exist). Then in your <code>Microsoft.PowerShell_profile.ps1</code> file enter the following:</p>

<pre><code class="language-powershell"># slobs-cli
$Env:SLOBS_DOMAIN = &#34;localhost&#34;
$Env:SLOBS_PORT = &#34;59650&#34;
$Env:SLOBS_TOKEN = &#34;&lt;API Token&gt;&#34;
</code></pre>

<p>To be sure this works, restart your Powershell session then retry the command above but without passing the connection flags:</p>

<p><code>slobs-cli scene list</code></p>

<hr>

<h4 id="running-slobs-cli-on-the-d6">Running slobs-cli on the D6</h4>

<p>Now that we have the CLI installed and a way to run it, we just need to run it from the D6 controller!</p>

<p>Expand the ScriptDeck plugin in the Fifine Control Deck software and drag-and-drop <em>Powershell Script</em> onto a button. Then in <em>When Pressed</em> just enter your slobs-cli command like so:</p>

<p><img src="https://img.onyxandiris.online/api/photo/d6software-slobs-cli_ovrGwuRR.png?token=nNdY64P9" alt="D6Software-Slobs-CLI"></p>

<p>Fingers crossed, press the button on your D6 and watch the scene switch. Voilà!</p>

<p>You can do a lot more than just scene switching with slobs-cli so check the <a href="https://github.com/onyx-and-iris/slobs-cli/blob/main/README.md#commands">README</a> for a full list of available commands.</p>

<hr>

<h4 id="conclusion">Conclusion</h4>

<p>If you&#39;re unfamiliar with python or CLIs in general then this process might seem daunting, but if you follow the steps carefully it shouldn&#39;t take long to set up.</p>

<p>Furthermore, you can now control other software using CLIs, for example, OBS, Voicemeeter, Meld Studio etc.</p>

<p>Further Notes:</p>
<ul><li>The same process can be carried out for Meld Studio by swapping out the slobs-cli part for <a href="https://github.com/onyx-and-iris/meld-cli">meld-cli</a>. However, it requires node instead of python and, at the time of writing, it&#39;s still possible to port the Meld Studio plugin as detailed in the ScriptDeck section above.</li></ul>

<p>Subscribe to this blog&#39;s <a href="https://blog.onyxandiris.online/feed/">RSS feed</a></p>
]]></content:encoded>
      <guid>https://blog.onyxandiris.online/ampligame-d6-and-streamlabs</guid>
      <pubDate>Mon, 29 Dec 2025 15:02:10 +0000</pubDate>
    </item>
    <item>
      <title>Learn Go with Pocket-Size Projects</title>
      <link>https://blog.onyxandiris.online/learn-go-with-pocket-size-projects</link>
      <description>&lt;![CDATA[by Aliénor Latour, Donia Chaiehloudj, and Pascal Bertrand&#xA;&#xA;In November of last year I decided to take part in Manning&#39;s manuscript review program as I was already reading one of their MEAP books, Learn Go with Pocket-Sized Projects. Here are my thoughts after reading the entire manuscript...&#xA;&#xA;!--more--&#xA;&#xA;Target Audience and Goals of the Book&#xA;&#xA;As outlined in the first chapter the book&#39;s target audience includes:&#xA;&#xA;Those with previous experience in other languages who would like to extend their professional skills.&#xA;Teams who are considering Go for their next project by providing a broad and thorough insight of the language.&#xA;Busy people who are looking to complete projects that are both rewarding but doable in a reasonable amount of time.&#xA;&#xA;The author&#39;s aim to achieve the following goals:&#xA;&#xA;Teach using an iterative process with each chapter split into sections that guide you through project implementations on a commit-by-commit basis.&#xA;Provide exemplary and clear examples of industry-level Go code.&#xA;Leave the reader inspired and with enough knowledge to go on to write excellent Go code themselves.&#xA;&#xA;---&#xA;&#xA;Structure&#xA;&#xA;The book is comprised of 12 chapters, each one walking the reader through a single project although the final chapter technically covers two subjects and is designed to round off the book.&#xA;&#xA;The kinds of projects you will develop include:&#xA;&#xA;A 3-level logging library with a stable, exported API.&#xA;A CLI money converter app&#xA;A concurrent maze solver&#xA;A gRCP web service and client.&#xA;&#xA;Each chapter follows a similar structure:&#xA;&#xA;Introductory text written in a narrative style often providing historical or cultural context to the problem at hand.&#xA;An outline of the projects Requirements and Limitations.&#xA;A step by step walkthrough of the project&#39;s implementation&#xA;A summary section 
detailing the salient points.&#xA;&#xA;I don&#39;t have enough space in this article to talk about every project in the book but I&#39;ll briefly cover two of them.&#xA;&#xA;---&#xA;&#xA;Bookworm&#39;s digest (chapter 3)&#xA;&#xA;If you&#39;re like me and you love reading books, then anything bookworm related is probably interesting to you! In this chapter you&#39;ll load virtual bookshelves stored in JSON files and use Go to read, sort and analyse the data.&#xA;&#xA;This chapter covers the following:&#xA;&#xA;Use of maps as an in-memory storage&#xA;Use of deferred functions and their stacked execution&#xA;Decoding JSON files into Go structs&#xA;Sorting slices with custom comparators as well as implementing sort.Interface{}&#xA;&#xA;Concurrent maze solver (chapter 9)&#xA;&#xA;This chapter introduces you to the world of maze solving.&#xA;&#xA;This chapter covers the following:&#xA;&#xA;Use of the image/png library to load and write mazes to disk.&#xA;How to spin up many goroutines to search paths simultaneously.&#xA;How to record the paths searched and colour them on the PNG.&#xA;How to synchronise goroutines using channels and waitgroups.&#xA;&#xA;maze&#xA;&#xA;---&#xA;&#xA;Do the authors achieve their goals?&#xA;&#xA;In my opinion the answer to this is a clear yes. &#xA;&#xA;Goal one: Iterative process&#xA;&#xA;Each chapter is split into sections with each section building on the previous. The chapters themselves are also sufficiently organised by difficulty.&#xA;&#xA;The authors take a bottom up approach, for each project they start with the basics from initializing the module, defining the project specification, thinking through the design choices and finally onto implementation.&#xA;&#xA;The reader is encouraged to commit their work to source control before any major additions.&#xA;&#xA;The authors go to great lengths to explain their thought processes. 
This is very important from the readers perspective because it allows you to think about &#39;why&#39; something should be done and not just &#39;how&#39;.&#xA;&#xA;Goal two: Production level code&#xA;&#xA;Testing is a primary concern throughout the book and encouraged at every step. &#xA;&#xA;The authors stress the importance of thinking through design choices carefully, not only their impact in the moment but also into the future.&#xA;&#xA;Although anti-patterns are used to demonstrate less than ideal code, the authors do a good job of explaining the shortcomings and then go on to demonstrate a preferred implementation.&#xA;&#xA;Goal three: Projects scaled for busy people&#xA;&#xA;As is addressed in the book&#39;s appendix, several of the chapters have you write projects that rely on in-memory databases primarily to keep the projects sized reasonably (doable in a day or two). The authors stated clearly both in the chapters and in the appendix that this is normally an unacceptable solution and would not be suitable for production code.&#xA;&#xA;---&#xA;&#xA;Conclusion&#xA;&#xA;I think that Learn Go with Pocket-Sized Projects is a beautifully written book that serves its stated goals.&#xA;&#xA;The book does a good job of presenting the strength and flexibility of Go, offering projects that deliver a broad coverage of the language.&#xA;&#xA;The writing style is a mixture of conversational, humorous and terse but for the most part explanations are straightforward and to the point.&#xA;&#xA;I wouldn&#39;t recommend this book as a first read in Go for a newcomer, better a second or third book. Certainly I would recommend it to anyone looking for a book for further their knowledge in Go best practices and project design.&#xA;&#xA;Further Notes:&#xA;&#xA;The downloadable code that accompanies the book was very well organised!&#xA;&#xA;Subscribe to this blog&#39;s RSS feed]]&gt;</description>
<content:encoded><![CDATA[<h3 id="by-aliénor-latour-donia-chaiehloudj-and-pascal-bertrand">by Aliénor Latour, Donia Chaiehloudj, and Pascal Bertrand</h3>

<p>In November of last year I decided to take part in <a href="https://www.manning.com/manuscript-reviews">Manning&#39;s manuscript review program</a> as I was already reading one of their MEAP books, <a href="https://www.manning.com/books/learn-go-with-pocket-sized-projects">Learn Go with Pocket-Sized Projects</a>. Here are my thoughts after reading the entire manuscript...</p>



<h4 id="target-audience-and-goals-of-the-book">Target Audience and Goals of the Book</h4>

<p>As outlined in the first chapter the book&#39;s target audience includes:</p>
<ul><li>Those with previous experience in other languages who would like to extend their professional skills.</li>
<li>Teams who are considering Go for their next project, by providing a broad and thorough insight into the language.</li>
<li>Busy people who are looking to complete projects that are both rewarding and doable in a reasonable amount of time.</li></ul>

<p>The authors aim to achieve the following goals:</p>
<ul><li>Teach using an iterative process with each chapter split into sections that guide you through project implementations on a commit-by-commit basis.</li>
<li>Provide exemplary and clear examples of industry-level Go code.</li>
<li>Leave the reader inspired and with enough knowledge to go on to write excellent Go code themselves.</li></ul>

<hr>

<h4 id="structure">Structure</h4>

<p>The book comprises 12 chapters, each walking the reader through a single project, although the final chapter technically covers two subjects and is designed to round off the book.</p>

<p>The kinds of projects you will develop include:</p>
<ul><li>A 3-level logging library with a stable, exported API.</li>
<li>A CLI money converter app.</li>
<li>A concurrent maze solver.</li>
<li>A gRPC web service and client.</li></ul>

<p>Each chapter follows a similar structure:</p>
<ul><li>Introductory text written in a narrative style often providing historical or cultural context to the problem at hand.</li>
<li>An outline of the project&#39;s <em>Requirements</em> and <em>Limitations</em>.</li>
<li>A step-by-step walkthrough of the project&#39;s implementation.</li>
<li>A summary section detailing the salient points.</li></ul>

<p>I don&#39;t have enough space in this article to talk about every project in the book but I&#39;ll briefly cover two of them.</p>

<hr>

<h4 id="bookworm-s-digest-chapter-3">Bookworm&#39;s digest (chapter 3)</h4>

<p>If you&#39;re like me and you love reading books, then anything <em>bookworm</em> related is probably interesting to you! In this chapter you&#39;ll load virtual bookshelves stored in JSON files and use Go to read, sort and analyse the data.</p>

<p>This chapter covers the following:</p>
<ul><li>Use of maps as in-memory storage</li>
<li>Use of deferred functions and their stacked execution</li>
<li>Decoding JSON files into Go structs</li>
<li>Sorting slices with custom comparators as well as implementing <code>sort.Interface{}</code></li></ul>

<h4 id="concurrent-maze-solver-chapter-9">Concurrent maze solver (chapter 9)</h4>

<p>This chapter introduces you to the world of maze solving.</p>

<p>This chapter covers the following:</p>
<ul><li>Use of the <code>image/png</code> library to load and write mazes to disk.</li>
<li>How to spin up many goroutines to search paths simultaneously.</li>
<li>How to record the paths searched and colour them on the PNG.</li>
<li>How to synchronise goroutines using channels and waitgroups.</li></ul>

<p><img src="https://img.onyxandiris.online/api/photo/maze_HCMjGfDQ.png?token=xQJUPW4o" alt="maze"></p>

<hr>

<h4 id="do-the-authors-achieve-their-goals">Do the authors achieve their goals?</h4>

<p>In my opinion the answer to this is a clear yes.</p>

<h5 id="goal-one-iterative-process">Goal one: Iterative process</h5>

<p>Each chapter is split into sections with each section building on the previous. The chapters themselves are also sufficiently organised by difficulty.</p>

<p>The authors take a bottom-up approach: for each project they start with the basics, from initializing the module, to defining the project specification and thinking through the design choices, and finally on to implementation.</p>

<p>The reader is encouraged to commit their work to source control before any major additions.</p>

<p>The authors go to great lengths to explain their thought processes. This is very important from the reader&#39;s perspective because it allows you to think about &#39;why&#39; something should be done and not just &#39;how&#39;.</p>

<h5 id="goal-two-production-level-code">Goal two: Production level code</h5>

<p>Testing is a primary concern throughout the book and encouraged at every step.</p>

<p>The authors stress the importance of thinking through design choices carefully, not only their impact in the moment but also into the future.</p>

<p>Although anti-patterns are used to demonstrate less than ideal code, the authors do a good job of explaining the shortcomings and then go on to demonstrate a preferred implementation.</p>

<h5 id="goal-three-projects-scaled-for-busy-people">Goal three: Projects scaled for busy people</h5>

<p>As is addressed in the book&#39;s appendix, several of the chapters have you write projects that rely on in-memory <em>databases</em>, primarily to keep the projects reasonably sized (doable in a day or two). The authors state clearly, both in the chapters and in the appendix, that this would normally be an unacceptable solution and would <em>not</em> be suitable for production code.</p>

<hr>

<h4 id="conclusion">Conclusion</h4>

<p>I think that Learn Go with Pocket-Sized Projects is a beautifully written book that serves its stated goals.</p>

<p>The book does a good job of presenting the strengths and flexibility of Go, offering projects that deliver broad coverage of the language.</p>

<p>The writing style is a mixture of conversational, humorous and terse but for the most part explanations are straightforward and to the point.</p>

<p>I wouldn&#39;t recommend this book as a first read in Go for a newcomer; it&#39;s better suited as a second or third book. I would certainly recommend it to anyone looking to further their knowledge of Go best practices and project design.</p>

<p>Further Notes:</p>
<ul><li>The downloadable code that accompanies the book was very well organised!</li></ul>

<p>Subscribe to this blog&#39;s <a href="https://blog.onyxandiris.online/feed/">RSS feed</a></p>
]]></content:encoded>
      <guid>https://blog.onyxandiris.online/learn-go-with-pocket-size-projects</guid>
      <pubDate>Thu, 26 Jun 2025 17:08:12 +0000</pubDate>
    </item>
    <item>
      <title>Abstracting Away From a Base Class</title>
      <link>https://blog.onyxandiris.online/abstracting-away-from-a-base-class</link>
      <description>&lt;![CDATA[A few years ago I picked up a Midas MR18 Digital Mixer for mixing music and streaming online. After a while I discovered the need to make adjustments programmatically, however, there was no official API so I decided to investigate...&#xA;&#xA;!--more--&#xA;&#xA;Background&#xA;&#xA;Behringer offer an official app XAIR Edit that can be used to remote control a mixer from an android device or a Windows PC. It uses the OSC protocol over UDP to communicate with the mixer.&#xA;&#xA;Patrick-Gilles Maillot has done a lot of fantastic work (mostly in C) with the XAIR/MAIR series. Of particular help to me was the documentation he&#39;d drawn up along with an X32 Emulator which I found very useful in testing my own code.&#xA;&#xA;---&#xA;&#xA;Discovering the base class&#xA;&#xA;It turns out that one thing I really enjoy doing is writing interfaces that represent families of products. I&#39;ve spent a lot of time over the past few years programming with the Voicemeeter API, which itself is a family of products, so I thought why not give it a go here as well.&#xA;&#xA;To start with I went searching to see if anyone had already done this and I stumbled across the Xair-Remote python package by Peter Dikant. It&#39;s a useful package that allows a user to connect an X-TOUCH MINI MIDI Controller to an XAIR mixer. With it you can control parameter states including volumes, mute states and bus sends. Digging into the code a little I noticed he&#39;d written a base XAirClient class and used it&#39;s send() method to communicate directly with the mixer. 
So it occurred to me, perhaps we can decouple this base class from its implementation, write an abstraction layer over it, scale it according to each kind of mixer and present a pythonic interface that represents the XAIR/MAIR family of mixers.&#xA;&#xA;---&#xA;&#xA;Developing the Interface&#xA;&#xA;Step one, extract the base class&#xA;&#xA;This was mostly a copy-paste, I&#39;m very grateful to the original developer of the Xair-Remote package, it sped up the process of writing this interface.&#xA;&#xA;Now that we&#39;re dealing with an interface that represents a family of products it makes perfect sense to define the base class as an ABC, as it will serve as the launching point for the various mixer classes.&#xA;&#xA;Step two, lay out the kind maps&#xA;&#xA;Since we want our abstraction layer to scale correctly, it helps us to create dataclasses that map precisely the structure of each kind of mixer.&#xA;&#xA;For example, this kind map would represent the XR18 mixer:&#xA;&#xA;@dataclass(frozen=True)&#xA;class XR18KindMap(KindMap):&#xA;    # note ch 17-18 defined as aux return&#xA;    numdca: int = 4&#xA;    numstrip: int = 16&#xA;    numbus: int = 6&#xA;    numfx: int = 4&#xA;&#xA;Note, we expect the kind maps to remain frozen for the lifetime of the program, this way they behave more like named tuples.&#xA;&#xA;Step three, write the abstraction layer&#xA;&#xA;In writing the abstraction layer I relied heavily on documentation written up by others. I also had to rely somewhat on intuition and a lot of testing. Since I&#39;m not an audio engineer and I only have access to a single product in the family of products at points I just did my best.&#xA;&#xA;I&#39;ll take the Strip class as a single example. 
First, an abstract base class that defines some default implementation:&#xA;&#xA;class IStrip(abc.ABC):&#xA;    def init(self, remote, index: int):&#xA;        self.remote = remote&#xA;        self.index = index + 1&#xA;&#xA;    def getter(self, param: str) -  tuple:&#xA;        return self.remote.query(f&#34;{self.address}/{param}&#34;)&#xA;&#xA;    def setter(self, param: str, val: int):&#xA;        self.remote.send(f&#34;{self.address}/{param}&#34;, val)&#xA;&#xA;    @abc.abstractmethod&#xA;    def address(self):&#xA;        pass&#xA;&#xA;Then, a concrete class that mixes in a whole bunch of other classes that precisely define the layout for a single strip:&#xA;&#xA;class Strip(IStrip):&#xA;    @classmethod&#xA;    def make(cls, remote, index):&#xA;        STRIPcls = type(&#xA;            f&#34;Strip{remote.kind}&#34;,&#xA;            (cls,),&#xA;            {&#xA;                *{&#xA;                    cls.name.lower(): type(&#xA;                        f&#34;{cls.name}{remote.kind}&#34;, (cls, cls), {}&#xA;                    )(remote, index)&#xA;                    for cls in (&#xA;                        Config,&#xA;                        Preamp,&#xA;                        Gate,&#xA;                        ...&#xA;                    )&#xA;                },&#xA;                ...&#xA;            },&#xA;        )&#xA;        return STRIPcls(remote, index)&#xA;&#xA;    @property&#xA;    def address(self) -  str:&#xA;        return f&#34;/ch/{str(self.index).zfill(2)}&#34;&#xA;&#xA;Finally, a factory function for composing each XAirRemote{kind} object:&#xA;&#xA;def initxair(self, args, *kwargs):&#xA;    XAirRemote.init(self, args, *kwargs)&#xA;    self.kind = kind&#xA;    self.strip = tuple(Strip.make(self, i) for i in range(kind.numstrip))&#xA;&#xA;All of the classes are built and loaded into memory at import time ready to be requested by the package entry point.&#xA;&#xA;Extending the interface to support the X32&#xA;&#xA;When I first wrote the 
XAIR-API package I&#39;d originally intended to support the XAIR/MAIR series only. Some of the OSC addresses differ slightly for the X32 because it is (physically) a substantially different mixer. Whereas the XAIR/MAIR are digital rack mixers, the X32 is a full blown desk mixer with many more channels and physical controls. However, due to a particular request from a particular user of the interface I decided to investigate support for the X32. &#xA;&#xA;To that end I wrote some adapter classes, for example:&#xA;&#xA;class Bus(IBus):&#xA;    @property&#xA;    def address(self):&#xA;        return f&#34;/bus/{str(self.index).zfill(2)}&#34;&#xA;&#xA;They override the addresses for the XAIR series modifying them according to the X32 specification. In the case of Bus addresses, the XAIR series use /bus/1/ whereas the X32 uses /bus/01, as you can see numbers are left padded with zeros.&#xA;&#xA;Then I wrote a separate factory function for the x32, using the adapter classes to build the layout for the interface:&#xA;&#xA;def initx32(self, args, *kwargs):&#xA;    XAirRemote.init(self, args, **kwargs)&#xA;    self.kind = kind&#xA;    self.bus = tuple(adapter.Bus.make(self, i) for i in range(kind.numbus))&#xA;&#xA;---&#xA;&#xA;Conclusion&#xA;&#xA;All in all I found the exercise of decoupling a base class written by another developer and writing it to an interface an eye-opening experience. It forced me to really think about the following:&#xA;&#xA;The best way to implement the interface internally.&#xA;What it would be like to use from the consumer&#39;s perspective.&#xA;Which parts to expose.&#xA;How to present a pythonic interface that abstracts away from the details of OSC.&#xA;&#xA;I have made public the full source code.&#xA;&#xA;Subscribe to this blog&#39;s RSS feed]]&gt;</description>
      <content:encoded><![CDATA[<p>A few years ago I picked up a <a href="https://www.midasconsoles.com/product.html?modelCode=0605-AAF">Midas MR18 Digital Mixer</a> for mixing music and streaming online. After a while I discovered the need to make adjustments programmatically, however, there was no official API so I decided to investigate...</p>



<h4 id="background">Background</h4>

<p>Behringer offer an official app, XAIR Edit, that can be used to remote control a mixer from an Android device or a Windows PC. It uses the OSC protocol over UDP to communicate with the mixer.</p>

<p><a href="https://sites.google.com/site/patrickmaillot/x32">Patrick-Gilles Maillot</a> has done a lot of fantastic work (mostly in C) with the XAIR/MAIR series. Of particular help to me was the documentation he&#39;d drawn up along with an <a href="https://sites.google.com/site/patrickmaillot/x32#h.p_rE4IH0Luimc0">X32 Emulator</a> which I found very useful in testing my own code.</p>

<hr>

<h4 id="discovering-the-base-class">Discovering the base class</h4>

<p>It turns out that one thing I really enjoy doing is writing interfaces that represent families of products. I&#39;ve spent a lot of time over the past few years programming with the Voicemeeter API, which itself is a family of products, so I thought why not give it a go here as well.</p>

<p>To start with I went searching to see if anyone had already done this and I stumbled across the <a href="https://github.com/peterdikant/xair-remote">Xair-Remote</a> python package by Peter Dikant. It&#39;s a useful package that allows a user to connect an <a href="https://www.behringer.com/product.html?modelCode=0808-AAF">X-TOUCH MINI MIDI Controller</a> to an XAIR mixer. With it you can control parameter states including volumes, mute states and bus sends. Digging into the code a little I noticed he&#39;d written a base <a href="https://github.com/peterdikant/xair-remote/blob/master/lib/xair.py">XAirClient</a> class and used its <code>send()</code> method to communicate directly with the mixer. So it occurred to me: perhaps we could decouple this base class from its implementation, write an abstraction layer over it, scale it according to each kind of mixer and present a pythonic interface that represents the XAIR/MAIR family of mixers.</p>

<hr>

<h4 id="developing-the-interface">Developing the Interface</h4>

<h5 id="step-one-extract-the-base-class-https-git-onyxandiris-online-onyx-online-xair-api-python-src-branch-dev-xair-api-xair-py">Step one, <a href="https://git.onyxandiris.online/onyx_online/xair-api-python/src/branch/dev/xair_api/xair.py">extract the base class</a></h5>

<p>This was mostly a copy-paste. I&#39;m very grateful to the original developer of the Xair-Remote package; it sped up the process of writing this interface.</p>

<p>Now that we&#39;re dealing with an interface that represents a family of products it makes perfect sense to define the base class as an ABC, as it will serve as the launching point for the various mixer classes.</p>

<h5 id="step-two-lay-out-the-kind-maps-https-git-onyxandiris-online-onyx-online-xair-api-python-src-branch-dev-xair-api-kinds-py">Step two, <a href="https://git.onyxandiris.online/onyx_online/xair-api-python/src/branch/dev/xair_api/kinds.py">lay out the kind maps</a></h5>

<p>Since we want our abstraction layer to scale correctly, it helps us to create dataclasses that map precisely the structure of each kind of mixer.</p>

<p>For example, this kind map would represent the XR18 mixer:</p>

<pre><code class="language-python">@dataclass(frozen=True)
class XR18KindMap(KindMap):
    # note ch 17-18 defined as aux return
    num_dca: int = 4
    num_strip: int = 16
    num_bus: int = 6
    num_fx: int = 4
</code></pre>

<p>Note: we expect the kind maps to remain frozen for the lifetime of the program; this way they behave more like named tuples.</p>
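<p>To illustrate: a frozen dataclass rejects attribute assignment after construction, which is what gives the kind maps their named-tuple-like behaviour (a toy example, not part of the package):</p>

<pre><code class="language-python">import dataclasses

@dataclasses.dataclass(frozen=True)
class ToyKindMap:
    num_strip: int = 16
    num_bus: int = 6

kind = ToyKindMap()
try:
    kind.num_bus = 7  # any reassignment raises FrozenInstanceError
except dataclasses.FrozenInstanceError:
    print('kind maps stay immutable for the lifetime of the program')
</code></pre>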

<h5 id="step-three-write-the-abstraction-layer-https-git-onyxandiris-online-onyx-online-xair-api-python-src-branch-dev-xair-api-shared-py">Step three, <a href="https://git.onyxandiris.online/onyx_online/xair-api-python/src/branch/dev/xair_api/shared.py">write the abstraction layer</a></h5>

<p>In writing the abstraction layer I relied heavily on documentation written up by others. I also had to rely somewhat on intuition and a lot of testing; since I&#39;m not an audio engineer and only have access to a single product in the family, at points I just did my best.</p>

<p>I&#39;ll take the Strip class as a single example. First, an abstract base class that defines some default implementation:</p>

<pre><code class="language-python">class IStrip(abc.ABC):
    def __init__(self, remote, index: int):
        self._remote = remote
        self.index = index + 1

    def getter(self, param: str) -&gt; tuple:
        return self._remote.query(f&#34;{self.address}/{param}&#34;)

    def setter(self, param: str, val: int):
        self._remote.send(f&#34;{self.address}/{param}&#34;, val)

    @property
    @abc.abstractmethod
    def address(self):
        pass
</code></pre>

<p>Then, a concrete class that mixes in a whole bunch of other classes that precisely define the layout for a single strip:</p>

<pre><code class="language-python">class Strip(IStrip):
    @classmethod
    def make(cls, remote, index):
        STRIP_cls = type(
            f&#34;Strip{remote.kind}&#34;,
            (cls,),
            {
                **{
                    _cls.__name__.lower(): type(
                        f&#34;{_cls.__name__}{remote.kind}&#34;, (_cls, cls), {}
                    )(remote, index)
                    for _cls in (
                        Config,
                        Preamp,
                        Gate,
                        ...
                    )
                },
                ...
            },
        )
        return STRIP_cls(remote, index)

    @property
    def address(self) -&gt; str:
        return f&#34;/ch/{str(self.index).zfill(2)}&#34;

</code></pre>

<p>Finally, a factory function for composing each <code>XAirRemote{kind}</code> object:</p>

<pre><code class="language-python">def init_xair(self, *args, **kwargs):
    XAirRemote.__init__(self, *args, **kwargs)
    self.kind = kind
    self.strip = tuple(Strip.make(self, i) for i in range(kind.num_strip))
</code></pre>

<p>All of the classes are <a href="https://git.onyxandiris.online/onyx_online/xair-api-python/src/commit/669aba4cc422a29509ed42d6c5080f8f7188736e/xair_api/xair.py#L178">built and loaded into memory</a> at import time ready to be requested by the package entry point.</p>
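<p>The mechanism can be pictured as a registry of kind-specific classes populated at import time, with the entry point simply looking one up (an illustrative sketch; these names are not the package&#39;s real API):</p>

<pre><code class="language-python"># Build one remote class per kind at import time, keyed by kind name.
class ToyRemote:
    def __init__(self, kind):
        self.kind = kind

_REGISTRY = {
    kind: type(f'XAirRemote{kind}', (ToyRemote,), {})
    for kind in ('XR12', 'XR16', 'XR18', 'X32')
}

def connect(kind):
    """Entry point: instantiate the class that was pre-built for this kind."""
    return _REGISTRY[kind](kind)

remote = connect('XR18')  # an XAirRemoteXR18 instance
</code></pre>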

<h5 id="extending-the-interface-to-support-the-x32-https-git-onyxandiris-online-onyx-online-xair-api-python-src-branch-dev-xair-api-adapter-py">Extending the interface to <a href="https://git.onyxandiris.online/onyx_online/xair-api-python/src/branch/dev/xair_api/adapter.py">support the X32</a></h5>

<p>When I first wrote the XAIR-API package I&#39;d originally intended to support the XAIR/MAIR series only. Some of the OSC addresses differ slightly for the X32 because it is (physically) a substantially different mixer. Whereas the XAIR/MAIR are digital rack mixers, the X32 is a full blown desk mixer with many more channels and physical controls. However, due to a particular request from a particular user of the interface I decided to investigate support for the X32.</p>

<p>To that end I wrote some adapter classes, for example:</p>

<pre><code class="language-python">class Bus(IBus):
    @property
    def address(self):
        return f&#34;/bus/{str(self.index).zfill(2)}&#34;
</code></pre>

<p>They override the addresses used for the XAIR series, modifying them according to the X32 specification. In the case of Bus addresses, the XAIR series uses <code>/bus/1/</code> whereas the X32 uses <code>/bus/01</code>; as you can see, numbers are left-padded with zeros.</p>
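<p>The effect of such an adapter can be shown with a toy pair of classes mirroring the pattern (illustrative only; the real classes carry far more behaviour):</p>

<pre><code class="language-python">class XAirBus:
    def __init__(self, index):
        self.index = index

    @property
    def address(self):
        return f'/bus/{self.index}/'

class X32Bus(XAirBus):
    # Adapter: same index, but an X32-style zero-padded address.
    @property
    def address(self):
        return f'/bus/{str(self.index).zfill(2)}'

print(XAirBus(1).address)  # /bus/1/
print(X32Bus(1).address)   # /bus/01
</code></pre>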

<p>Then I wrote a separate factory function for the x32, using the adapter classes to build the layout for the interface:</p>

<pre><code class="language-python">def init_x32(self, *args, **kwargs):
    XAirRemote.__init__(self, *args, **kwargs)
    self.kind = kind
    self.bus = tuple(adapter.Bus.make(self, i) for i in range(kind.num_bus))
</code></pre>

<hr>

<h4 id="conclusion">Conclusion</h4>

<p>All in all I found the exercise of decoupling a base class written by another developer and writing it to an interface an eye-opening experience. It forced me to really think about the following:</p>
<ul><li>The best way to implement the interface internally.</li>
<li>What it would be like to use from the consumer&#39;s perspective.</li>
<li>Which parts to expose.</li>
<li>How to present a pythonic interface that abstracts away from the details of OSC.</li></ul>

<p>I have made public the <a href="https://git.onyxandiris.online/onyx_online/xair-api-python">full source code</a>.</p>

<p>Subscribe to this blog&#39;s <a href="https://blog.onyxandiris.online/feed/">RSS feed</a></p>
]]></content:encoded>
      <guid>https://blog.onyxandiris.online/abstracting-away-from-a-base-class</guid>
      <pubDate>Thu, 17 Oct 2024 11:08:32 +0000</pubDate>
    </item>
    <item>
      <title>Head First C# Fifth Edition</title>
      <link>https://blog.onyxandiris.online/head-first-c-fifth-edition</link>
      <description>&lt;![CDATA[by Andrew Stellman &amp; Jennifer Greene&#xA;&#xA;Earlier this month I was lucky enough to get my hands on a complimentary copy of Head First C# Fifth Edition. There were things I expected and things that surprised me, here are my thoughts...&#xA;&#xA;!--more--&#xA;Front Cover&#xA;About the Review&#xA;&#xA;This review will be an opinion piece from the perspective of a learner. I&#39;m not a beginner in programming but I am new to C#. That puts me in the target audience for this book. I went into this primarily interested in GUI development. The scope of this review will be limited to the learning experience, for a more technical review you will need to look elsewhere.&#xA;&#xA;---&#xA;&#xA;Style&#xA;&#xA;It&#39;s impossible to write about my experience with a Head First book without discussing the general Head First style. Books in this series tend to employ a lot of visual aids, a conversational tone, a wide variety of exercises and deliberate repetition. For a more detailed explanation you may check this page on the O&#39;Reilly site.&#xA;&#xA;Since this is a review from my perspective I will give my personal opinion. There are aspects to this teaching style that I prefer over others. For example, the conversational tone suits me fine. Do I feel I emneed/em it to remain attentive? Not really, I&#39;m quite happy to read books written in a formal tone as well.&#xA;&#xA;The redundancy is a factor I consider to be a major strength, sure it&#39;s repetition but it&#39;s explanation of the same idea in emdifferent ways/em. Having said that, I didn&#39;t find it helpful every time and in fact on one or two occasions I found myself skipping an (already explained) point in order not to break my train of thought.&#xA;&#xA;I like the use of images to break up text. 
As you can see from the example page linked there is a clear structure to the page working its way from top to bottom, at the same time the reader&#39;s eye is drawn between locations. This page is a clear demonstration of the cognitive friendly approach this book aims for.&#xA;&#xA;On the topic of presentation, the one thing I noticed immediately upon opening the book was the use of hand-drawn characters. Historically, the Head First series always used photographed images of people, for an example of what I&#39;m talking about:&#xA;&#xA;img src=&#34;https://img.onyxandiris.online/api/photo/sidebyside2S9oWQ6oi.png?token=O2Tw7p5U&#34;&lt;/img&#xA;&#xA;This is a huge improvement in my opinion and makes the pages look cleaner, less distracting and more relatable. It&#39;s worth pointing out that this update appears to be series wide as I&#39;ve noticed the same in other latest edition Head First books.&#xA;&#xA;---&#xA;&#xA;Topics Covered&#xA;&#xA;The book has 12 main chapters and then intertwined with them six Unity Labs. Topics covered in this book, in no particular order, include but are not limited to:&#xA;&#xA;Good code style, ie sensible variable, method naming.&#xA;Refactoring.&#xA;Basic types and flow control constructs.&#xA;General OOP concepts: inheritance, composition, polymorphism, encapsulation, cohesion, separation of concerns, DRY etc.&#xA;Abstract and Concrete classes.&#xA;Using the Visual Studio debugger.&#xA;The encouragement and use of paper prototypes.&#xA;XAML and C# code behind in .NET MAUI apps.&#xA;Data binding.&#xA;Automatic properties with backing fields.&#xA;Interfaces (with and without default implementations), how to use them and why they are important.&#xA;Upcasting and downcasting.&#xA;Sorting techniques with IComparable and IComparer interfaces.&#xA;Collections, specifically covered: List, Dictionary, Queues, Stacks.&#xA;LINQ queries, LINQ methods and deferred evaluation.&#xA;File and memory streams. 
Network and Gzip streams are briefly mentioned but not explored.&#xA;IDisposable interface for handling the cleanup of unmanaged resources.&#xA;Object serialization.&#xA;The garbage collector.&#xA;Exception handling.&#xA;Nullable value types and the null-coalescing operator.&#xA;Extending sealed classes.&#xA;Unit testing with MSTest, writing for edge cases and unpredictable input.&#xA;Logging with Serilog&#xA;The encouragement and use of AI assisted learning and an introduction to prompt engineering.&#xA;Feedback loops, emergence and how they affect dynamics both in games and other areas of programming.&#xA;&#xA;As a learner, the parts I particularly liked&#xA;&#xA;The book opens with a clear and detailed walkthrough with screenshots guiding the reader on setting up their development environment. This is important because perhaps the reader has never used Visual Studio. I&#39;ve read other books before that lacked detailed setup instructions, this can lead to great confusion and leave a bad initial impression.&#xA;&#xA;The first project you&#39;re tasked with is an Animal Matching game. &#xA;&#xA;Animal Matching Game&#xA;&#xA;This is a great introduction to the book because it gives a fast demonstration of the power and flexibility of C# and XAML with just a few lines of code.&#xA;&#xA;The .NET MAUI project Random Cards walks you through a structured process. The discussion around Ana&#39;s game also covers point one.&#xA;&#xA;Plan in advance how you intend to model your ideas into classes, possibly with the aid of a paper prototype. &#xA;Write a working command line application. &#xA;Complete the project by using those same classes in a GUI application. All the while reinforcing the importance of accessibility. &#xA;&#xA;Random Cards&#xA;&#xA;I very much like this methodology, first of all because it encourages the learner to really think about what they are doing. 
As a beginner, it&#39;s way too easy to want to rush in and start writing code, then later regret some of the choices that a better design would have avoided. By putting careful thought and planning into how you organise your classes and which methods you will make public you can separate the internal design of a class from its consuming code.&#xA;Second, the process is incremental. It gives the learner the opportunity to start with the basics and build up as they go along.&#xA;Last of all, having personally spent time working with users with accessibility needs, I am intimately familiar with the frustrations that can arise from inaccessible GUI design. Computers are used by all kinds of users around the world and it&#39;s important that beginners are encouraged to follow best practices concerning accessibility.&#xA;&#xA;Each chapter ends with a Q&amp;A style section. As the authors state, some of these questions are actual questions they&#39;ve been asked by readers of past editions of the book. This is a great format for a few reasons:&#xA;&#xA;It gives the opportunity for readers to process topics covered in a back and forth manner&#xA;It encourages the reader to ask questions themselves&#xA;It may well answer a specific question they already had. The example that comes to mind personally, the IDisposable interface had been covered in chapter 10 but only in the context of files and streams. I found myself asking, is this an interface appropriately used with other types of classes? The Q&amp;A in chapter 12 asks and answers this question almost verbatim.&#xA;&#xA;Criticisms&#xA;&#xA;As I mentioned in the introduction of this review, my primary interest is in GUI development and there was plenty of that present. I am less interested in game development and there was a lot of that too. Now I have to be fair in my assessment, the authors go to a lot of effort to explain their reasoning behind the game-centred focus. 
The lessons taught throughout the book while building the game projects are broadly applicable across programming. Nevertheless, my question is simple, would it have been possible to teach some of those same programming concepts while focusing on a wider variety of non game-centred projects?&#xA;&#xA;Possible improvements&#xA;&#xA;The lumberjack exercise in chapter eight has you build a console app to demonstrate the Stack and Queue ADTs. It does a sufficient job but I have to wonder whether this was a great opportunity to demonstrate the nature of the stack and queue in a more visual, .NET MAUI GUI app.&#xA;&#xA;---&#xA;&#xA;Overall thoughts&#xA;&#xA;I believe Head First C# Fifth Edition is a very strong effort. It&#39;s clear to me that a lot of care and attention went into producing a complete and thorough learning experience for the reader. The emphasis on good programming practices, GUI accessibility and self learning techniques especially impressed me.&#xA;&#xA;The first several chapters are gently paced and introduce fundamental concepts in C# but also programming in general. If you prefer a fast paced introduction to a language you should bear this in mind.&#xA;&#xA;I&#39;ve talked about the general style of Head First books and given my personal opinion but each person&#39;s experience will vary. I will say this much, if you&#39;ve never tried reading a Head First book and C# piques your interest then I encourage you to give this book a go.&#xA;&#xA;Further Notes:&#xA;&#xA;The authors have made available on their Github repository Blazor versions of all the .NET MAUI apps.&#xA;They have a YouTube channel where they post guides related to the book.&#xA;&#xA;Subscribe to this blog&#39;s RSS feed]]&gt;</description>
<content:encoded><![CDATA[<h3 id="by-andrew-stellman-jennifer-greene">by Andrew Stellman &amp; Jennifer Greene</h3>

<p>Earlier this month I was lucky enough to get my hands on a complimentary copy of <a href="https://www.oreilly.com/library/view/head-first-c/9781098141776/">Head First C# Fifth Edition</a>. There were things I expected and things that surprised me, here are my thoughts...</p>



<p><img src="https://img.onyxandiris.online/api/photo/headfirstcsharp4_Yq8qMlbY.jpg?token=bh2FvAtr" alt="Front Cover"></p>

<h4 id="about-the-review">About the Review</h4>

<p>This review will be an opinion piece from the perspective of a learner. I&#39;m not a beginner in programming but I am new to C#. That puts me in the target audience for this book. I went into this primarily interested in GUI development. The scope of this review will be limited to the learning experience, for a more technical review you will need to look elsewhere.</p>

<hr>

<h4 id="style">Style</h4>

<p>It&#39;s impossible to write about my experience with a Head First book without discussing the general Head First style. Books in this series tend to employ a lot of visual aids, a conversational tone, a wide variety of exercises and deliberate repetition. For a more detailed explanation you may check <a href="https://www.oreilly.com/library/view/head-first-c/9781098141776/preface04.html">this page on the O&#39;Reilly site</a>.</p>

<p>Since this is a review from my perspective I will give my personal opinion. There are aspects to this teaching style that I prefer over others. For example, the conversational tone suits me fine. Do I feel I <em>need</em> it to remain attentive? Not really, I&#39;m quite happy to read books written in a formal tone as well.</p>

<p>The redundancy is a factor I consider a major strength: sure, it&#39;s repetition, but it&#39;s explanation of the same idea in <em>different ways</em>. Having said that, I didn&#39;t find it helpful every time; in fact, on one or two occasions I found myself skipping an (already explained) point in order not to break my train of thought.</p>

<p>I like the use of <a href="https://img.onyxandiris.online/api/photo/arrayofobjects_E1nwFcPl.png?token=xMQgmsaK">images to break up text</a>. As you can see from the example page linked, there is a clear structure to the page, working its way from top to bottom while the reader&#39;s eye is drawn between locations. This page is a clear demonstration of the cognitively friendly approach this book aims for.</p>

<p>On the topic of presentation, the one thing I noticed immediately upon opening the book was the use of hand-drawn characters. Historically, the Head First series always used photographed images of people, for an example of what I&#39;m talking about:</p>

<p><img src="https://img.onyxandiris.online/api/photo/sidebyside2_S9oWQ6oi.png?token=O2Tw7p5U"></p>

<p>This is a huge improvement in my opinion and makes the pages look cleaner, less distracting and more relatable. It&#39;s worth pointing out that this update appears to be series-wide, as I&#39;ve noticed the same in other latest-edition Head First books.</p>

<hr>

<h4 id="topics-covered">Topics Covered</h4>

<p>The book has 12 main chapters, with six <a href="https://img.onyxandiris.online/api/photo/unitylab_pwG4wdQu.png?token=zAP2HTPt">Unity Labs</a> intertwined among them. Topics covered in this book, in no particular order, include but are not limited to:</p>
<ul><li>Good code style, i.e. sensible variable and method naming.</li>
<li>Refactoring.</li>
<li>Basic types and flow control constructs.</li>
<li>General OOP concepts: inheritance, composition, polymorphism, encapsulation, cohesion, separation of concerns, DRY etc.</li>
<li>Abstract and Concrete classes.</li>
<li>Using the Visual Studio debugger.</li>
<li>The encouragement and use of <a href="https://img.onyxandiris.online/api/photo/paperproto_StlMbzA7.png?token=xFCHFY8O">paper prototypes</a>.</li>
<li>XAML and C# code behind in .NET MAUI apps.</li>
<li>Data binding.</li>
<li>Automatic properties with backing fields.</li>
<li>Interfaces (with and without default implementations), how to use them and why they are important.</li>
<li>Upcasting and downcasting.</li>
<li>Sorting techniques with <code>IComparable</code> and <code>IComparer</code> interfaces.</li>
<li>Collections; specifically covered: List, Dictionary, Queue and Stack.</li>
<li>LINQ queries, LINQ methods and deferred evaluation.</li>
<li>File and memory streams. Network and Gzip streams are briefly mentioned but not explored.</li>
<li><code>IDisposable</code> interface for handling the cleanup of unmanaged resources.</li>
<li>Object serialization.</li>
<li>The garbage collector.</li>
<li>Exception handling.</li>
<li>Nullable value types and the null-coalescing operator.</li>
<li>Extending sealed classes.</li>
<li>Unit testing with <a href="https://learn.microsoft.com/en-us/dotnet/core/testing/unit-testing-with-mstest">MSTest</a>, writing for edge cases and unpredictable input.</li>
<li>Logging with <a href="https://github.com/serilog/serilog">Serilog</a>.</li>
<li>The encouragement and use of AI assisted learning and an introduction to prompt engineering.</li>
<li>Feedback loops, emergence and how they affect dynamics both in games and other areas of programming.</li></ul>

<h4 id="as-a-learner-the-parts-i-particularly-liked">As a learner, the parts I particularly liked</h4>

<p>The book opens with a clear and detailed <a href="https://img.onyxandiris.online/api/photo/installvs_0MeOFKG2.png?token=LIaybJOu">walkthrough with screenshots</a> guiding the reader through setting up their development environment. This is important because the reader may never have used Visual Studio. I&#39;ve read other books that lacked detailed setup instructions; this can lead to great confusion and leave a bad initial impression.</p>

<p>The first project you&#39;re tasked with is an Animal Matching game.</p>

<p><img src="https://img.onyxandiris.online/api/photo/animalmatching_GcZkSlw2.png?token=QhMgthjg" alt="Animal Matching Game"></p>

<p>This is a great introduction to the book because it gives a fast demonstration of the power and flexibility of C# and XAML with just a few lines of code.</p>

<p>The .NET MAUI project Random Cards walks you through a structured process (the discussion around Ana&#39;s game also covers the first step):</p>
<ol><li>Plan in advance how you intend to model your ideas into classes, possibly with the aid of a paper prototype.</li>
<li>Write a working command line application.</li>
<li>Complete the project by using those same classes in a GUI application. All the while reinforcing the importance of accessibility.</li></ol>

<p><img src="https://img.onyxandiris.online/api/photo/randomcards_4mfeBCMf.png?token=Bn0gXNtU" alt="Random Cards"></p>

<p>I very much like this methodology. First of all, it encourages the learner to really think about what they are doing. As a beginner, it&#39;s way too easy to rush in and start writing code, then later regret choices that a better design would have avoided. By putting careful thought and planning into how you organise your classes and which methods you will make public, you can separate the internal design of a class from its consuming code.
Second, the process is incremental. It gives the learner the opportunity to start with the basics and build up as they go along.
Last of all, having personally spent time working with users with <a href="https://blog.onyxandiris.online/voicemeeter-accessibility-for-the-blind">accessibility needs</a>, I am intimately familiar with the frustrations that can arise from inaccessible GUI design. Computers are used by all kinds of users around the world and it&#39;s important that beginners are encouraged to follow best practices concerning accessibility.</p>

<p>Each chapter ends with a Q&amp;A style section. As the authors state, some of these questions are actual questions they&#39;ve been asked by readers of past editions of the book. This is a great format for a few reasons:</p>
<ul><li>It gives readers the opportunity to process topics covered in a back-and-forth manner.</li>
<li>It encourages the reader to ask questions themselves.</li>
<li>It may well answer a specific question they already had. The example that comes to mind personally: the <code>IDisposable</code> interface had been covered in chapter 10, but only in the context of files and streams. I found myself asking, is this an interface appropriately used with other types of classes? The Q&amp;A in chapter 12 asks and answers this question almost verbatim.</li></ul>

<h4 id="criticisms">Criticisms</h4>

<p>As I mentioned in the introduction of this review, my primary interest is in GUI development, and there was plenty of that present. I am less interested in game development, and there was a lot of that too. Now, to be fair in my assessment, the authors go to a lot of effort to explain their reasoning behind the game-centred focus. The lessons taught throughout the book while building the game projects are broadly applicable across programming. Nevertheless, my question is simple: would it have been possible to teach some of those same programming concepts while focusing on a wider variety of non-game-centred projects?</p>

<h4 id="possible-improvements">Possible improvements</h4>

<p>The lumberjack exercise in chapter eight has you build a console app to demonstrate the Stack and Queue ADTs. It does a sufficient job, but I have to wonder whether this was a missed opportunity to demonstrate the nature of the stack and queue in a more visual .NET MAUI GUI app.</p>

<hr>

<h4 id="overall-thoughts">Overall thoughts</h4>

<p>I believe Head First C# Fifth Edition is a very strong effort. It&#39;s clear to me that a lot of care and attention went into producing a complete and thorough learning experience for the reader. The emphasis on good programming practices, GUI accessibility and self-learning techniques especially impressed me.</p>

<p>The first several chapters are gently paced and introduce fundamental concepts not only in C# but in programming in general. If you prefer a fast-paced introduction to a language, you should bear this in mind.</p>

<p>I&#39;ve talked about the general style of Head First books and given my personal opinion, but each person&#39;s experience will vary. I will say this much: if you&#39;ve never tried reading a Head First book and C# piques your interest, then I encourage you to give this book a go.</p>

<p>Further Notes:</p>
<ul><li>The authors have made available on their <a href="https://github.com/head-first-csharp/fifth-edition">Github repository</a> Blazor versions of all the .NET MAUI apps.</li>
<li>They have a <a href="https://www.youtube.com/@headfirstcsharp">YouTube channel</a> where they post guides related to the book.</li></ul>

<p>Subscribe to this blog&#39;s <a href="https://blog.onyxandiris.online/feed/">RSS feed</a></p>
]]></content:encoded>
      <guid>https://blog.onyxandiris.online/head-first-c-fifth-edition</guid>
      <pubDate>Fri, 30 Aug 2024 18:07:54 +0000</pubDate>
    </item>
    <item>
      <title>Programming with the IceOps Plugin API</title>
      <link>https://blog.onyxandiris.online/programming-with-the-iceops-plugin-api</link>
      <description>&lt;![CDATA[More than a decade ago now the guys of the IceOps-Team developed a server extension for the stock MW1 game offering a programmable plugin API. In this article I will detail the process of writing a plugin to prevent IP and URL advertisement in a game server.&#xA;&#xA;!--more--&#xA;&#xA;The plugin API offers a broad range of functions for managing cvars, interacting with registered commands, file operations and so on.&#xA;&#xA;As well as that, it offers callback functions for game events:&#xA;&#xA;PCL void OnPreFastRestart();&#xA;PCL void OnExitLevel();&#xA;PCL void OnPostFastRestart();&#xA;PCL void OnPreGameRestart(int savepersist);&#xA;PCL void OnPostGameRestart(int savepersist);&#xA;PCL void OnSpawnServer();&#xA;&#xA;---&#xA;&#xA;First things first, compile suitable regexes for later use:&#xA;&#xA;qboolean createregex()&#xA;{&#xA;    return regcomp(&amp;regex.ip, REGIP, REGEXTENDED) == 0 &amp;&amp;&#xA;           regcomp(&amp;regex.url, REGURL, REGEXTENDED) == 0;&#xA;}&#xA;We indicate to the server if they fail to compile in the OnInit() function:&#xA;&#xA;PCL int OnInit()&#xA;{&#xA;    if (createregex() == qfalse)&#xA;    {&#xA;        loginfo(&#34;AdStop: Failed to compile RegEx, exiting...&#34;);&#xA;        return 1;&#xA;    }&#xA;&#xA;    ...&#xA;}&#xA;&#xA;Then we register our plugin cvars, this gives users of the plugin the ability to configure its behaviour from config files:&#xA;&#xA;PCL int OnInit()&#xA;{&#xA;    ...&#xA;&#xA;    cvars.sub = PluginCvarRegisterString(..., ..., ...);&#xA;    cvars.subip = PluginCvarRegisterBool(..., ..., ...);&#xA;    cvars.suburl = PluginCvarRegisterBool(..., ..., ...);&#xA;&#xA;    return 0;&#xA;}&#xA;&#xA;---&#xA;&#xA;Before checking for regex matches we must first clean the incoming message of colour codes. 
If we don&#39;t do this coloured messages may fail a regex test.&#xA;&#xA;The following snippet uses pointer arithmetic to step through the string until we meet ^[0-9], skips those characters if present, otherwise the char at p is copied into the position at q. The message string is effectively cleaned in situ.&#xA;&#xA;ptrdifft removecolours(char message)&#xA;{&#xA;    char p, q;&#xA;&#xA;    p = q = message;&#xA;    while (p)&#xA;    {&#xA;        if (p == &#39;^&#39; &amp;&amp; (p + 1) &amp;&amp; isdigit((p + 1)))&#xA;        {&#xA;            p++;&#xA;        }&#xA;        else if (isprint(p))&#xA;        {&#xA;            q++ = p;&#xA;        }&#xA;        p++;&#xA;    }&#xA;    q = &#39;\0&#39;;&#xA;&#xA;    return q - message;&#xA;}&#xA;&#xA;Finally, if a match occurs we overwrite the original message. All of this is handled in the APIs OnMessageSent callback function:&#xA;&#xA;PCL void OnMessageSent(char msg, int slot, qboolean *show, int mode)&#xA;{&#xA;    removecolours(msg);&#xA;&#xA;    enum matchtype match = matches(msg);&#xA;    if (match == IP  || match == URL)&#xA;    {&#xA;        snprintf(message,&#xA;                 MAXSAYTEXT,&#xA;                 PluginCvarGetString(cvars.sub));&#xA;    }&#xA;&#xA;    ...&#xA;}&#xA;&#xA;---&#xA;&#xA;Once the plugin is loaded abusers attempting to advertise on the server will have their messages replaced.&#xA;&#xA;AdStop Plugin&#xA;&#xA;Note. if the replacement text is set to a blank string the users message will not show at all, not even a prompt indicating an attempted message.&#xA;&#xA;Subscribe to this blog&#39;s RSS feed]]&gt;</description>
<content:encoded><![CDATA[<p>More than a decade ago, the IceOps team developed a server extension for the stock MW1 game offering a <a href="https://github.com/callofduty4x/CoD4x_Server/tree/master/plugins">programmable plugin API</a>. In this article I will detail the process of writing a plugin to prevent IP and URL advertising on a game server.</p>



<p>The plugin API offers a broad range of functions for managing cvars, interacting with registered commands, file operations and so on.</p>

<p>As well as that, it offers callback functions for game events:</p>

<pre><code class="language-C">PCL void OnPreFastRestart();
PCL void OnExitLevel();
PCL void OnPostFastRestart();
PCL void OnPreGameRestart(int savepersist);
PCL void OnPostGameRestart(int savepersist);
PCL void OnSpawnServer();
</code></pre>

<hr>

<p>First things first, compile suitable regexes for later use:</p>

<pre><code class="language-C">qboolean create_regex()
{
    return regcomp(&amp;regex.ip, REG_IP, REG_EXTENDED) == 0 &amp;&amp;
           regcomp(&amp;regex.url, REG_URL, REG_EXTENDED) == 0;
}
</code></pre>
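<p>The article doesn&#39;t show <code>REG_IP</code> and <code>REG_URL</code> themselves, so here is a sketch with plausible stand-in patterns (POSIX extended syntax; treat them as assumptions, not the plugin&#39;s real patterns):</p>

<pre><code class="language-C">#include &lt;regex.h&gt;

/* Hypothetical stand-ins for the plugin&#39;s REG_IP / REG_URL patterns. */
#define REG_IP  &#34;([0-9]{1,3}\\.){3}[0-9]{1,3}(:[0-9]{1,5})?&#34;
#define REG_URL &#34;(https?://)?[a-zA-Z0-9-]+(\\.[a-zA-Z0-9-]+)+(/[^ ]*)?&#34;

typedef int qboolean;

static struct
{
    regex_t ip;
    regex_t url;
} regex;

/* Compile both patterns; non-zero only if both succeed. */
qboolean create_regex(void)
{
    return regcomp(&amp;regex.ip, REG_IP, REG_EXTENDED) == 0 &amp;&amp;
           regcomp(&amp;regex.url, REG_URL, REG_EXTENDED) == 0;
}
</code></pre>

<p>An unanchored <code>regexec()</code> call against either compiled pattern then reports whether a chat message contains an IP or a URL anywhere in the text.</p>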

<p>If the patterns fail to compile we indicate as much to the server in the <code>OnInit()</code> function:</p>

<pre><code class="language-C">PCL int OnInit()
{
    if (create_regex() == qfalse)
    {
        log_info(&#34;AdStop: Failed to compile RegEx, exiting...&#34;);
        return 1;
    }

    ...
}
</code></pre>

<p>Then we register our plugin cvars; this gives users of the plugin the ability to configure its behaviour from config files:</p>

<pre><code class="language-C">PCL int OnInit()
{
    ...

    cvars.sub = Plugin_Cvar_RegisterString(..., ..., ...);
    cvars.sub_ip = Plugin_Cvar_RegisterBool(..., ..., ...);
    cvars.sub_url = Plugin_Cvar_RegisterBool(..., ..., ...);

    return 0;
}
</code></pre>

<hr>

<p>Before checking for regex matches we must first clean the incoming message of colour codes. If we don&#39;t, coloured messages may fail a regex test.</p>

<p>The following snippet uses pointer arithmetic to step through the string; whenever it meets <code>^[0-9]</code> it skips those characters, otherwise the char at <code>p</code> is copied into the position at <code>q</code>. The message string is effectively cleaned in situ.</p>

<pre><code class="language-C">ptrdiff_t remove_colours(char *message)
{
    char *p, *q;

    p = q = message;
    while (*p)
    {
        if (*p == &#39;^&#39; &amp;&amp; *(p + 1) &amp;&amp; isdigit(*(p + 1)))
        {
            p++;
        }
        else if (isprint(*p))
        {
            *q++ = *p;
        }
        p++;
    }
    *q = &#39;\0&#39;;

    return q - message;
}
</code></pre>
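<p>As a quick sanity check, the routine can be exercised on a colour-coded message (function reproduced from above):</p>

<pre><code class="language-C">#include &lt;ctype.h&gt;
#include &lt;stddef.h&gt;

/* Reproduced from above: strips ^0..^9 colour codes in place and
   returns the cleaned length. */
ptrdiff_t remove_colours(char *message)
{
    char *p, *q;

    p = q = message;
    while (*p)
    {
        if (*p == &#39;^&#39; &amp;&amp; *(p + 1) &amp;&amp; isdigit(*(p + 1)))
        {
            p++; /* skip the &#39;^&#39;; the trailing p++ then skips the digit */
        }
        else if (isprint(*p))
        {
            *q++ = *p;
        }
        p++;
    }
    *q = &#39;\0&#39;;

    return q - message;
}
</code></pre>

<p>Given <code>&#34;^1Join ^2my ^3server!&#34;</code>, the cleaned string is <code>&#34;Join my server!&#34;</code> and the returned length is 15.</p>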

<p>Finally, if a match occurs we overwrite the original message. All of this is handled in the API&#39;s <code>OnMessageSent</code> callback function:</p>

<pre><code class="language-C">PCL void OnMessageSent(char *msg, int slot, qboolean *show, int mode)
{
    remove_colours(msg);

    enum match_type match = matches(msg);
    if (match == IP || match == URL)
    {
        /* Pass &#34;%s&#34; explicitly; never use cvar text as the format string. */
        snprintf(msg,
                 MAX_SAY_TEXT,
                 &#34;%s&#34;,
                 Plugin_Cvar_GetString(cvars.sub));
    }

    ...
}
</code></pre>
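<p>The <code>matches()</code> helper isn&#39;t shown in the article; a hypothetical sketch, assuming the patterns have already been compiled at plugin init (cf. <code>create_regex()</code>), might look like this:</p>

<pre><code class="language-C">#include &lt;regex.h&gt;

enum match_type { NONE, IP, URL };

/* Assumed to have been compiled elsewhere, e.g. by create_regex(). */
static regex_t ip_re, url_re;

/* Classify a (colour-stripped) message by its first matching pattern. */
enum match_type matches(const char *msg)
{
    if (regexec(&amp;ip_re, msg, 0, NULL, 0) == 0)
        return IP;
    if (regexec(&amp;url_re, msg, 0, NULL, 0) == 0)
        return URL;
    return NONE;
}
</code></pre>

<p>Checking the IP pattern first is an arbitrary choice here; <code>OnMessageSent</code> only cares that the result is <code>IP</code> or <code>URL</code>.</p>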

<hr>

<p>Once the plugin is loaded, abusers attempting to advertise on the server will have their messages replaced.</p>

<p><img src="https://img.onyxandiris.online/api/photo/adstop_pVBP8G6G.png?token=5NyiBUhQ" alt="AdStop Plugin"></p>

<p>Note: if the <code>replacement text</code> is set to a blank string, the user&#39;s message will not show at all, not even a prompt indicating an attempted message.</p>

<p>Subscribe to this blog&#39;s <a href="https://blog.onyxandiris.online/feed/">RSS feed</a></p>
]]></content:encoded>
      <guid>https://blog.onyxandiris.online/programming-with-the-iceops-plugin-api</guid>
      <pubDate>Sun, 14 Jul 2024 18:36:51 +0000</pubDate>
    </item>
    <item>
      <title>Interfacing with Voicemeeter on the Command Line</title>
      <link>https://blog.onyxandiris.online/interfacing-with-voicemeeter-on-the-command-line</link>
      <description>&lt;![CDATA[I recently picked up the book C Programming A Modern Approach 2e by K. N. King. Although I&#39;ve dabbled with C over the years this is the first time I&#39;ve committed any period of time to learning it. I took this opportunity to do some programming with the Voicemeeter SDK while reading the book.&#xA;&#xA;!--more--&#xA;&#xA;My goal was to create a CLI program that implements the following features:&#xA;&#xA;Direct mode.&#xA;Interactive mode.&#xA;Possibility to set, get and toggle parameters.&#xA;Load commands from script files.&#xA;Launch the GUI (and possibly other tools) using flags.&#xA;Load configuration files (xml profiles).&#xA;Provide logging&#xA;&#xA;---&#xA;&#xA;First step, have the CLI accept instructions as arguments and execute each in turn. Then define an -i flag to enable interactive mode. This allows a user to operate the program in two distinct modes.&#xA;&#xA;    while ((opt = getopt(argc, argv, OPTSTR)) != -1)&#xA;    {&#xA;        switch (opt)&#xA;        {&#xA;        case &#39;i&#39;:&#xA;            iflag = true;&#xA;            break;&#xA;        case &#39;h&#39;:&#xA;            [[fallthrough]];&#xA;        default:&#xA;            usage();&#xA;        }&#xA;    }&#xA;&#xA;    if (iflag)&#xA;    {&#xA;        puts(&#34;Interactive mode enabled. 
Enter &#39;Q&#39; to exit.&#34;);&#xA;        interactive(vmr);&#xA;    }&#xA;    else&#xA;    {&#xA;        for (int i = optind; i &lt; argc; i++)&#xA;        {&#xA;            parseinput(vmr, argv[i]);&#xA;        }&#xA;    }&#xA;&#xA;Interactive mode should read repeatedly from stdin:&#xA;&#xA;void interactive(PTVMR vmr)&#xA;{&#xA;    ...&#xA;&#xA;    while (fgets(input, MAXLINE, stdin) != NULL)&#xA;    {&#xA;        input[(len = strcspn(input, &#34;\n&#34;))] = 0;&#xA;        if (len == 1 &amp;&amp; toupper(input[0]) == &#39;Q&#39;)&#xA;            break;&#xA;&#xA;        parseinput(vmr, input);&#xA;        &#xA;        ...&#xA;    }&#xA;}&#xA;&#xA;Allowing a user to enter commands until an exit instruction is given:&#xA;&#xA;Interactive Mode&#xA;&#xA;---&#xA;&#xA;Since reading from script files may use either of the CLI&#39;s modes we must parse the input in both cases. Consider that a single line in a script may contain multiple instructions, for example:&#xA;&#xA;strip[0].gain=5 strip[1].comp+=4.8 strip[2].label=podmic&#xA;&#xA;So it&#39;s important that we split lines into separate instructions:&#xA;&#xA;void parseinput(PTVMR vmr, char input)&#xA;{&#xA;    ...&#xA;&#xA;    token = strtokr(input, DELIMITERS, &amp;p);&#xA;    while (token != NULL)&#xA;    {&#xA;        parsecommand(vmr, token);&#xA;        token = strtokr(NULL, DELIMITERS, &amp;p);&#xA;    }&#xA;}&#xA;&#xA;Here is an example run with verbose output enabled:&#xA;&#xA;Script File Example&#xA;&#xA;Edit 16-03-2026&#xA;&#xA;After some real world testing of the vmrcli I realised that quoted strings which denote string parameters containing spaces were being delimited. 
Examples of where this may become problematic:&#xA;&#xA;strip[0].label=&#34;my podmic&#34;&#xA;bus[2].device.wdm=&#34;Realtek Digital Output (Realtek(R) Audio)&#34;&#xA;&#xA;The solution was to modify the parser to track when we step in and out of quoted strings, skip the delimeters when in quote and write the resulting token to a buffer, passing that to parsecommand() instead. &#xA;&#xA;---&#xA;&#xA;The CLI application should correctly handle get, set and toggle operations.&#xA;&#xA;void parsecommand(PTVMR vmr, char command)&#xA;{&#xA;    logdebug(&#34;Parsing %s&#34;, command);&#xA;&#xA;    if (command[0] == &#39;!&#39;) / toggle /&#xA;    {&#xA;        ...&#xA;&#xA;        return;&#xA;    }&#xA;&#xA;    if (strchr(command, &#39;=&#39;) != NULL) / set /&#xA;    {&#xA;        ...&#xA;    }&#xA;    else / get /&#xA;    {&#xA;        ...&#xA;    }&#xA;}&#xA;&#xA;Set is trivial enough, we can simply use the VBVMRSetParameters api call. This handles both float and string parameters.&#xA;&#xA;Get is a little tricker because C is statically typed, meaning the compiler must be made aware of parameter and return types. To avoid having to explicitly track which commands are expected to return which type of response we can use the fact that a failed getparameterfloat will return an error code and then try getparameterstring. &#xA;&#xA;void get(PTVMR vmr, char command, struct result res)&#xA;{&#xA;    cleardirty(vmr);&#xA;    if (getparameterfloat(vmr, command, &amp;res-  val.f) != 0)&#xA;    {&#xA;        res-  type = STRINGT;&#xA;        if (getparameterstring(vmr, command, res-  val.s) != 0)&#xA;        {&#xA;            res-  val.s[0] = 0;&#xA;            logerror(&#34;Unknown parameter &#39;%s&#39;&#34;, command);&#xA;        }&#xA;    }&#xA;}&#xA;&#xA;As well as that, C offers Unions for building mixed data structures (covered in Chapter 16 of C Programming A Modern Approach). 
By defining a Struct with a Union member I was able to store and track the result.&#xA;&#xA;struct result&#xA;{&#xA;    enum restype type;&#xA;    union val&#xA;    {&#xA;        float f;&#xA;        wchart s[RESSZ];&#xA;    } val;&#xA;};&#xA;&#xA;Toggle is then simply an implementation of a get into a set. The only noteworth detail is that we should guard against unsafe gain changes. I handled this by first testing if the response was of type float, and then testing it against 1 and 0. Strictly speaking this doesn&#39;t guarantee a boolean parameter, but it does protect against dangerous operations such as Strip 0 Gain = (1 - (-18)) which could be hazardous to health or audio equipment.&#xA;        if (res.type == FLOATT)&#xA;        {&#xA;            if (res.val.f == 1 || res.val.f == 0)&#xA;            {&#xA;                setparameterfloat(vmr, command, 1 - res.val.f);&#xA;            }&#xA;            else&#xA;            {&#xA;                ...&#xA;            }&#xA;        }&#xA;&#xA;---&#xA;&#xA;I decided to use the log.c package by rxi to offer various levels of logging. Here is a demonstration of the CLI run in direct mode with TRACE logging enabled.&#xA;&#xA;Trace Logging&#xA;&#xA;As you can see, it gives a low level perspective of the API calls.&#xA;---&#xA;&#xA;This has been a very fun project to tackle, it&#39;s easy to see why people fall in love with programming in C.&#xA;&#xA;I have made public the full source code for this package.&#xA;&#xA;Further Notes:&#xA;&#xA;The binary in Releases was compiled with coloured logging enabled. Unfortunately it doesn&#39;t work properly on all terminals. So rebuilding the application with coloured logging disabled may be necessary.&#xA;&#xA;Subscribe to this blog&#39;s RSS feed]]&gt;</description>
      <content:encoded><![CDATA[<p>I recently picked up the book <a href="https://www.ebooks.com/en-gb/book/210130787/c-programming/k-n-king/">C Programming A Modern Approach 2e by K. N. King</a>. Although I&#39;ve dabbled with C over the years this is the first time I&#39;ve committed any period of time to learning it. I took this opportunity to do some programming with the <a href="https://github.com/vburel2018/Voicemeeter-SDK">Voicemeeter SDK</a> while reading the book.</p>



<p>My goal was to create a CLI program that implements the following features:</p>
<ul><li>Direct mode.</li>
<li>Interactive mode.</li>
<li>Possibility to set, get and toggle parameters.</li>
<li>Load commands from script files.</li>
<li>Launch the GUI (and possibly other tools) using flags.</li>
<li>Load configuration files (xml profiles).</li>
<li>Provide logging.</li></ul>

<hr>

<p>First step, have the CLI accept instructions as arguments and execute each in turn. Then define an <code>-i</code> flag to enable interactive mode. This allows a user to operate the program in two distinct modes.</p>

<pre><code class="language-C">    while ((opt = getopt(argc, argv, OPTSTR)) != -1)
    {
        switch (opt)
        {
        case &#39;i&#39;:
            iflag = true;
            break;
        case &#39;h&#39;:
            [[fallthrough]];
        default:
            usage();
        }
    }

    if (iflag)
    {
        puts(&#34;Interactive mode enabled. Enter &#39;Q&#39; to exit.&#34;);
        interactive(vmr);
    }
    else
    {
        for (int i = optind; i &lt; argc; i++)
        {
            parse_input(vmr, argv[i]);
        }
    }
</code></pre>

<p>Interactive mode should read repeatedly from stdin:</p>

<pre><code class="language-C">void interactive(PT_VMR vmr)
{
    ...

    while (fgets(input, MAX_LINE, stdin) != NULL)
    {
        input[(len = strcspn(input, &#34;\n&#34;))] = 0;
        if (len == 1 &amp;&amp; toupper(input[0]) == &#39;Q&#39;)
            break;

        parse_input(vmr, input);
        
        ...
    }
}
</code></pre>

<p>Allowing a user to enter commands until an exit instruction is given:</p>

<p><img src="https://img.onyxandiris.online/api/photo/loglevel3_zQBuWxTp.png?token=IoGvRC9t" alt="Interactive Mode"></p>

<hr>

<p>Since reading from script files <a href="https://git.onyxandiris.online/onyx_online/vmrcli#script-files">may use either of the CLI&#39;s modes</a> we must parse the input in both cases. Consider that a single line in a script may contain multiple instructions, for example:</p>

<p><code>strip[0].gain=5 strip[1].comp+=4.8 strip[2].label=podmic</code></p>

<p>So it&#39;s important that we split lines into separate instructions:</p>

<pre><code class="language-C">void parse_input(PT_VMR vmr, char *input)
{
    ...

    token = strtok_r(input, DELIMITERS, &amp;p);
    while (token != NULL)
    {
        parse_command(vmr, token);
        token = strtok_r(NULL, DELIMITERS, &amp;p);
    }
}
</code></pre>

<p>Here is an example run with verbose output enabled:</p>

<p><img src="https://img.onyxandiris.online/api/photo/loglevel2_x5cSkMHx.png?token=c6EpSxLm" alt="Script File Example"></p>

<p><em>Edit 16-03-2026</em></p>

<p>After some real-world testing of vmrcli I realised that quoted strings, which denote string parameters containing spaces, were being split at the delimiters. Examples of where this may become problematic:</p>
<ul><li><code>strip[0].label=&#34;my podmic&#34;</code></li>
<li><code>bus[2].device.wdm=&#34;Realtek Digital Output (Realtek(R) Audio)&#34;</code></li></ul>

<p>The solution was to <a href="https://git.onyxandiris.online/onyx_online/vmrcli/src/commit/3823e0c49799dc09d607d2851deffce2848abdc6/src/vmrcli.c#L332">modify the parser</a> to track when we step in and out of quoted strings, skip the delimiters while <em>in quote</em> and write the resulting token to a buffer, passing that to <code>parse_command()</code> instead.</p>
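<p>To illustrate the idea, here is a minimal Python sketch of the same quote-aware tokenising logic (the actual parser is written in C; the function name and delimiter set here are illustrative only):</p>

```python
def split_instructions(line: str, delimiters: str = " \t;,") -> list[str]:
    """Split a script line into instructions, keeping quoted strings intact."""
    tokens: list[str] = []
    buf: list[str] = []
    in_quote = False
    for ch in line:
        if ch == '"':
            # step in/out of a quoted string; the quote itself is dropped
            in_quote = not in_quote
        elif ch in delimiters and not in_quote:
            # a delimiter outside a quote ends the current token
            if buf:
                tokens.append("".join(buf))
                buf.clear()
        else:
            buf.append(ch)
    if buf:
        tokens.append("".join(buf))
    return tokens
```

<p>With this, <code>strip[0].label=&#34;my podmic&#34;</code> survives as a single token rather than being split at the space.</p>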

<hr>

<p>The CLI application should correctly handle get, set and toggle operations.</p>

<pre><code class="language-C">void parse_command(PT_VMR vmr, char *command)
{
    log_debug(&#34;Parsing %s&#34;, command);

    if (command[0] == &#39;!&#39;) /* toggle */
    {
        ...

        return;
    }

    if (strchr(command, &#39;=&#39;) != NULL) /* set */
    {
        ...
    }
    else /* get */
    {
        ...
    }
}
</code></pre>

<p>Set is trivial enough: we can simply use the <code>VBVMR_SetParameters</code> API call, which handles both float and string parameters.</p>

<p>Get is a little trickier because C is statically typed, meaning the compiler must be made aware of parameter and return types. To avoid explicitly tracking which commands return which type of response, we can rely on the fact that a failed <code>get_parameter_float</code> returns an error code, then fall back to <code>get_parameter_string</code>.</p>

<pre><code class="language-C">void get(PT_VMR vmr, char *command, struct result *res)
{
    clear_dirty(vmr);
    if (get_parameter_float(vmr, command, &amp;res-&gt;val.f) != 0)
    {
        res-&gt;type = STRING_T;
        if (get_parameter_string(vmr, command, res-&gt;val.s) != 0)
        {
            res-&gt;val.s[0] = 0;
            log_error(&#34;Unknown parameter &#39;%s&#39;&#34;, command);
        }
    }
}
</code></pre>

<p>As well as that, C offers unions for building mixed data structures (covered in Chapter 16 of C Programming: A Modern Approach). By defining a struct with a union member I was able to store and track the result.</p>

<pre><code class="language-C">struct result
{
    enum restype type;
    union val
    {
        float f;
        wchar_t s[RES_SZ];
    } val;
};
</code></pre>

<p>Toggle is then simply a get followed by a set. The only noteworthy detail is that we should guard against unsafe gain changes. I handled this by first testing whether the response was of type float, and then testing it against 1 and 0. Strictly speaking this doesn&#39;t guarantee a boolean parameter, but it does protect against dangerous operations such as <code>Strip 0 Gain = (1 - (-18))</code>, which could be hazardous to health or audio equipment.</p>

<pre><code class="language-C">        if (res.type == FLOAT_T)
        {
            if (res.val.f == 1 || res.val.f == 0)
            {
                set_parameter_float(vmr, command, 1 - res.val.f);
            }
            else
            {
                ...
            }
        }
</code></pre>

<hr>

<p>I decided to use the <a href="https://github.com/rxi/log.c">log.c package by rxi</a> to offer various levels of logging. Here is a demonstration of the CLI run in direct mode with TRACE logging enabled.</p>

<p><img src="https://img.onyxandiris.online/api/photo/loglevel0_iyTRqUQq.png?token=DbqhBwRV" alt="Trace Logging"></p>

<p>As you can see, it gives a low-level view of the API calls.</p>

<hr>

<p>This has been a very fun project to tackle; it&#39;s easy to see why people fall in love with programming in C.</p>

<p>I have made public the <a href="https://git.onyxandiris.online/onyx_online/vmrcli">full source code</a> for this package.</p>

<p>Further Notes:</p>
<ul><li>The binary in Releases was compiled with coloured logging enabled. Unfortunately this doesn&#39;t render properly on all terminals, so rebuilding the application with coloured logging disabled may be necessary.</li></ul>

<p>Subscribe to this blog&#39;s <a href="https://blog.onyxandiris.online/feed/">RSS feed</a></p>
]]></content:encoded>
      <guid>https://blog.onyxandiris.online/interfacing-with-voicemeeter-on-the-command-line</guid>
      <pubDate>Sat, 06 Jul 2024 09:56:28 +0000</pubDate>
    </item>
    <item>
      <title>Relaying Github Webhooks</title>
      <link>https://blog.onyxandiris.online/relaying-github-webhooks</link>
      <description>&lt;![CDATA[Programming is my primary interest but Linux server administation is another. I find webhooks a valuable way of keeping up to date and more recently a convenient way to receive updates from Github.&#xA;&#xA;!--more--&#xA;&#xA;For this purpose I decided a centralized system would benefit me. First of all it means I can configure all webhooks from one location. As well as this, it allows me to parse and possibly modify a webhooks payload before passing it onto Discord.&#xA;&#xA;To give an example of what I mean, there&#39;s an awesome backup utility for Linux called GoBackup. It includes the ability to notify, by webhook, completion status to various platforms including Discord. However, the default webhook message resembles:&#xA;&#xA;systemd: [GoBackup] OK: Backup service has successfully&#xA;&#xA;Backup of service completed successfully at 2023-10-23 09:17:47.443308376 +0100 BST&#xA;&#xA;Which is fine, it gives plenty of detail and may be all one needs. However, I prefer to use notifications that take the form:&#xA;&#xA;hostname :: Service: status message&#xA;&#xA;With the origin of the webhook (server hostname), followed by the Linux service name and finally the status message. I already get an idea of the time from Discord and if I need further information I check logs.&#xA;&#xA;hr&#xA;&#xA;Defining the route and handler&#xA;&#xA;I chose Flask for this task. 
First step, define the endpoint:&#xA;@app.route(&#34;/github-payload&#34;, methods=[&#34;POST&#34;])&#xA;def githubpayload():&#xA;    ...&#xA;&#xA;Then we&#39;ll need to parse the event type using the X-Github-Event header:&#xA;&#xA;    payload = request.getjson()&#xA;    match eventtype := request.headers.get(&#34;X-GitHub-Event&#34;):&#xA;        case &#34;issues&#34;:&#xA;            issueshandler(payload)&#xA;    return (&#34;&#34;, 200, None)&#xA;&#xA;And finally define a handler for this event type:&#xA;def issueshandler(payload):&#xA;    embed = {&#xA;        ...&#xA;    }&#xA;    data = {&#xA;        &#34;username&#34;: &#34;Github-OpenGist&#34;,&#xA;        &#34;embeds&#34;: [embed],&#xA;    }&#xA;    senddiscordwebhook(app.config[&#34;DISCORDWEBHOOK&#34;], data)&#xA;&#xA;def senddiscordwebhook(webhookurl, data):&#xA;    requests.post(webhookurl, json=data)&#xA;&#xA;The end result is a webhook message that closely resembles a webhook message directly from Github.&#xA;&#xA;hr&#xA;&#xA;Verifying the request&#xA;&#xA;Since I intend to sit this Flask server behind a reverse proxy I took it one step further. 
By fowarding the IP of the request we can check the requests ip against Github&#39;s Hook servers with the endpoint https://api.github.com/meta:&#xA;&#xA;def verifysrcip(srcip):&#xA;    allowedips = requests.get(&#34;https://api.github.com/meta&#34;).json()[&#34;hooks&#34;]&#xA;    return any(srcip in ipnetwork(validip) for validip in allowedips)&#xA;&#xA;Github also allows us to associate a secret with the webhook which we can verify like so:&#xA;def verifyhmachash(data, signature):&#xA;    githubsecret = bytes(app.config[&#34;GITHUBSECRET&#34;], &#34;ascii&#34;)&#xA;    mac = hmac.new(githubsecret, msg=data, digestmod=hashlib.sha1)&#xA;    return hmac.comparedigest(&#34;sha1=&#34; + mac.hexdigest(), signature)&#xA;&#xA;hr&#xA;&#xA;Conclusion&#xA;&#xA;With everything up and running I can configure/modify and receive webhook notification through a proxy server.&#xA;&#xA;I&#39;ve posted a partial implementation of the code, only Github Issues are defined but it can be extended to handle other event types. In my case I&#39;ve added more routes for linux services.&#xA;&#xA;Notes about the gist:&#xA;&#xA;It assumes configuration in a .env file located at the root of the server&#xA;If sat behind a reverse proxy it checks for the requests real ip using the header X-Real-IP so if you&#39;re using Apache you may need to alter this.&#xA;It&#39;s running in debug mode, not suitable for a live environment.&#xA;&#xA;Subscribe to this blog&#39;s RSS feed]]&gt;</description>
      <content:encoded><![CDATA[<p>Programming is my primary interest but Linux server administation is another. I find webhooks a valuable way of keeping up to date and more recently a convenient way to receive updates from Github.</p>



<p>For this purpose I decided a centralized system would benefit me. First of all, it means I can configure all webhooks from one location. It also allows me to parse and possibly modify a webhook&#39;s payload before passing it on to Discord.</p>

<p>To give an example of what I mean, there&#39;s an awesome backup utility for Linux called <a href="https://github.com/gobackup/gobackup">GoBackup</a>. It includes the ability to notify, by webhook, completion status to various platforms including Discord. However, the default webhook message resembles:</p>

<pre><code class="language-code">systemd: [GoBackup] OK: Backup service has successfully

Backup of service completed successfully at 2023-10-23 09:17:47.443308376 +0100 BST
</code></pre>

<p>Which is fine, it gives plenty of detail and may be all one needs. However, I prefer to use notifications that take the form:</p>

<pre><code class="language-code">&lt;hostname&gt; :: &lt;Service&gt;: &lt;status message&gt;
</code></pre>

<p>That is, the origin of the webhook (the server hostname), followed by the Linux service name and finally the status message. I already get an idea of the time from Discord, and if I need further information I check the logs.</p>
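<p>The reshaping itself is trivial; a small helper along these lines is enough (the name and signature here are my own, not taken from the gist):</p>

```python
def format_notification(hostname: str, service: str, status: str) -> str:
    """Render a relay message as 'hostname :: Service: status message'."""
    return f"{hostname} :: {service}: {status}"
```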

<hr>

<h4 id="defining-the-route-and-handler">Defining the route and handler</h4>

<p>I chose Flask for this task. First step, define the endpoint:</p>

<pre><code class="language-python">@app.route(&#34;/github-payload&#34;, methods=[&#34;POST&#34;])
def github_payload():
    ...
</code></pre>

<p>Then we&#39;ll need to parse the event type using the <code>X-Github-Event</code> header:</p>

<pre><code class="language-python">    payload = request.get_json()
    match event_type := request.headers.get(&#34;X-GitHub-Event&#34;):
        case &#34;issues&#34;:
            issues_handler(payload)
    return (&#34;&#34;, 200, None)
</code></pre>

<p>And finally define a handler for this event type:</p>

<pre><code class="language-python">def issues_handler(payload):
    embed = {
        ...
    }
    data = {
        &#34;username&#34;: &#34;Github-OpenGist&#34;,
        &#34;embeds&#34;: [embed],
    }
    send_discord_webhook(app.config[&#34;DISCORD_WEBHOOK&#34;], data)

def send_discord_webhook(webhook_url, data):
    requests.post(webhook_url, json=data)
</code></pre>
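<p>The embed body is elided above; as an illustration, a fleshed-out version for issue events might look like the following (the payload fields are standard for Github&#39;s <code>issues</code> event, the embed keys follow Discord&#39;s embed object, and the function name is my own):</p>

```python
def build_issue_embed(payload: dict) -> dict:
    """Build a Discord embed from a Github 'issues' event payload."""
    issue = payload["issue"]
    return {
        "title": f"[{payload['repository']['full_name']}] "
                 f"Issue {payload['action']}: #{issue['number']} {issue['title']}",
        "url": issue["html_url"],
        # issue bodies can be null; cap the length for readability
        "description": (issue.get("body") or "")[:500],
    }
```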

<p>The end result is a webhook message that closely resembles one sent directly from Github.</p>

<hr>

<h4 id="verifying-the-request">Verifying the request</h4>

<p>Since I intend to sit this Flask server behind a reverse proxy I took it one step further. By <a href="https://gist.github.com/patrocle/43f688e8cfef1a48c66f22825e9e0678">forwarding the IP of the request</a> we can check the request&#39;s IP against Github&#39;s hook servers via the endpoint <code>https://api.github.com/meta</code>:</p>

<pre><code class="language-python">def verify_src_ip(src_ip):
    allowed_ips = requests.get(&#34;https://api.github.com/meta&#34;).json()[&#34;hooks&#34;]
    return any(src_ip in ip_network(valid_ip) for valid_ip in allowed_ips)
</code></pre>

<p>Github also allows us to associate a secret with the webhook which we can verify like so:</p>

<pre><code class="language-python">def verify_hmac_hash(data, signature):
    github_secret = bytes(app.config[&#34;GITHUB_SECRET&#34;], &#34;ascii&#34;)
    mac = hmac.new(github_secret, msg=data, digestmod=hashlib.sha1)
    return hmac.compare_digest(&#34;sha1=&#34; + mac.hexdigest(), signature)
</code></pre>

<hr>

<h4 id="conclusion">Conclusion</h4>

<p>With everything up and running I can configure, modify and receive webhook notifications through a proxy server.</p>

<p>I&#39;ve posted a <a href="https://gist.onyxandiris.online/onyx_online/0ed9cb3783e94e37a5b82d71380f32e8">partial implementation</a> of the code; only Github Issues events are handled, but it can be extended to other event types. In my case I&#39;ve added more routes for Linux services.</p>

<p>Notes about the gist:</p>
<ul><li>It assumes configuration in a <code>.env</code> file located at the root of the server.</li>
<li>If sat behind a reverse proxy it checks for the request&#39;s real IP using the <code>X-Real-IP</code> header, so if you&#39;re using Apache you may need to alter this.</li>
<li>It&#39;s running in <a href="https://flask.palletsprojects.com/en/3.0.x/debugging/">debug mode</a>, not suitable for a live environment.</li></ul>

<p>Subscribe to this blog&#39;s <a href="https://blog.onyxandiris.online/feed/">RSS feed</a></p>
]]></content:encoded>
      <guid>https://blog.onyxandiris.online/relaying-github-webhooks</guid>
      <pubDate>Sat, 28 Oct 2023 11:48:52 +0000</pubDate>
    </item>
    <item>
      <title>PySimpleGUI and NVDA Voicemeeter</title>
      <link>https://blog.onyxandiris.online/pysimplegui-and-nvda-voicemeeter</link>
      <description>&lt;![CDATA[As mentioned in the previous post Voicemeeter Accessibility for the Blind I chose to work with PySimpleGUI when developing the NVDA Voicemeeter application. &#xA;&#xA;!--more--&#xA;&#xA;For the following reasons:&#xA;&#xA;A chance to work with a new framework.&#xA;I&#39;m somewhat familiar with Tkinter (one of the frameworks PySimpleGUI is based on)&#xA;It&#39;s use of standard Python types to abstract away from geometry managers.&#xA;It&#39;s use of a messaging system for it&#39;s event loop.&#xA;The speed at which you can throw up simple ideas into workable GUIs.&#xA;&#xA;To give a quick example of what I mean I&#39;ll borrow this snippet from the docs.&#xA;&#xA;import PySimpleGUI as sg&#xA;&#xA;layout = [&#xA;    [sg.Text(&#34;What&#39;s your name?&#34;)],&#xA;    [sg.Input(key=&#34;-INPUT-&#34;)],&#xA;    [sg.Text(size=(40, 1), key=&#34;-OUTPUT-&#34;)],&#xA;    [sg.Button(&#34;Ok&#34;), sg.Button(&#34;Quit&#34;)],&#xA;]&#xA;&#xA;window = sg.Window(&#34;Window Title&#34;, layout)&#xA;&#xA;while True:&#xA;    event, values = window.read()&#xA;    if event == sg.WINDOWCLOSED or event == &#34;Quit&#34;:&#xA;        break&#xA;&#xA;    window[&#34;-OUTPUT-&#34;].update(f&#34;Hello {values[&#39;-INPUT-&#39;]}!&#34;)&#xA;&#xA;window.close()&#xA;&#xA;Which produces the following window:&#xA;&#xA; img src=&#34;https://img.onyxandiris.online/api/photo/pysimplgui-exampleSOlYf6ZA.png?token=wP1doFC4&#34; alt=&#34;PySimpleGUI example&#34; &#xA;&#xA;As you can see, the code closely resembles the GUI that it represents. Where you would typically place a widget onto a row or column, with PySimpleGUI you can instead place them into lists, or lists of lists.&#xA;&#xA;In the NVDA Voicemeeter codebase I was able to make this idea scale by creating a Builder class with steps defined as methods and then calling each step in turn. 
For example, when laying out the Hardware Input buttonmenus I did the following:&#xA;&#xA;    def maketab0row0(self) -  psg.Frame:&#xA;        &#34;&#34;&#34;tab0 row0 represents hardware ins&#34;&#34;&#34;&#xA;&#xA;        def addphysicaldeviceopts(layout):&#xA;            devices = util.getinputdevicelist(self.vm)&#xA;            devices.append(&#34;- remove device selection -&#34;)&#xA;            layout.append(&#xA;                [&#xA;                    psg.ButtonMenu(&#xA;                        f&#34;IN {i + 1}&#34;,&#xA;                        size=(6, 3),&#xA;                        menudef=[&#34;&#34;, devices],&#xA;                        key=f&#34;HARDWARE IN||{i + 1}&#34;,&#xA;                    )&#xA;                    for i in range(self.kind.physin)&#xA;                ]&#xA;            )&#xA;&#xA;        hardwarein = []&#xA;        [step(hardwarein) for step in (addphysicaldeviceopts,)]&#xA;        return psg.Frame(&#34;Hardware In&#34;, hardwarein)&#xA;&#xA;Where a list used to represent the layout was passed to a builder method which in turn placed each ButtonMenu element sequentially.&#xA;&#xA;I was then able to make this scale further using the same idea for each tab. Importantly this gave me the freedom to structure dynamically, according to each kind of Voicemeeter the precise layout of the rows.&#xA;&#xA;    layout0 = []&#xA;    if self.kind.name == &#34;basic&#34;:&#xA;        steps = (&#xA;            self.maketab0row0,&#xA;            self.maketab0row1,&#xA;            self.maketab0row5,&#xA;        )&#xA;    else:&#xA;        steps = (&#xA;            self.maketab0row0,&#xA;            self.maketab0row1,&#xA;            self.maketab0row2,&#xA;            self.maketab0row3,&#xA;            self.maketab0row4,&#xA;            self.maketab0row5,&#xA;        )&#xA;    for step in steps:&#xA;        layout0.append([step()])&#xA;&#xA;hr&#xA;&#xA;Next I&#39;ll talk a bit about the event loop. 
Unlike other frameworks I&#39;ve worked with, PySimpleGUI events are not based on callbacks but instead an event loop message queue. Specifically, by initiating a while loop and evaluating the result of the read() method on the main window object we receive event data represented by an event string and a values dictionary. Like so:&#xA;&#xA;    while True:&#xA;        event, values = self.read()&#xA;        self.logger.debug(f&#34;event::{event}&#34;)&#xA;        self.logger.debug(f&#34;values::{values}&#34;)&#xA;        if event in (psg.WINCLOSED, &#34;Exit&#34;):&#xA;            break&#xA;&#xA;This gave me the idea to employ the pyparsing library. It describes itself as an alternative approach to creating and executing simple grammars, vs. the traditional lex/yacc approach, or the use of regular expressions. Since I already had a good idea what the event identifiers would look like (Channel type, index, property type and so on), I figured this was an ideal approach to parsing the event loop. By defining a parser that could split the widget type from the parameter it represents and the event that triggered it, I was able to parse events such as this:&#xA;&#xA;    case [[&#34;BUS&#34;, index], [param], [&#34;KEY&#34;, &#34;SPACE&#34; | &#34;ENTER&#34;]]:&#xA;        if param == &#34;MODE&#34;:&#xA;            util.opencontextmenuforbuttonmenu(self, f&#34;BUS {index}||MODE&#34;)&#xA;        else:&#xA;            self.findelementwithfocus().click()&#xA;&#xA;This for example allowed me to define the action taken when space or enter were pressed on any element representing a Bus class parameter.&#xA;&#xA;hr&#xA;&#xA;All in all I was pleased with my choice to investigate the PySimpleGUI library. It let me spend more time focusing on the functionality and less time thinking about layouts and callbacks.&#xA;&#xA;The only roadblock I did come across were the ButtonMenu elements. By default I was unable to open the context menus with a keyboard, only with a mouse. 
After reaching out to the PSG devs they were able to inform me that by modifying the underlying Widget object I could make ButtonMenus focusable by a keyboard.&#xA;&#xA;    buttonmenuopts = {&#34;takefocus&#34;: 1, &#34;highlightthickness&#34;: 1}&#xA;    for i in range(self.kind.physin):&#xA;        self[f&#34;HARDWARE IN||{i + 1}&#34;].Widget.config(*buttonmenuopts)&#xA;&#xA;hr&#xA;&#xA;I will finish off by talking about the NVDA controller client. The api it presents is only small, exporting just four functions:&#xA;&#xA;/ commstatus / errorstatust stdcall nvdaControllertestIfRunning( void);&#xA;&#xA;/ commstatus / errorstatust _stdcall nvdaControllerspeakText( &#xA;    / string / const wchart text);&#xA;&#xA;/ commstatus / errorstatust stdcall nvdaControllercancelSpeech( void);&#xA;&#xA;/ commstatus / errorstatust _stdcall nvdaControllerbrailleMessage( &#xA;    / string / const wchart message);&#xA;&#xA;The one I was most concerned with was nvdaControllerspeakText but nonetheless since the API was so small I decided to define bindings for all four functions and present them in a wrapper class in Python:&#xA;&#xA;class CBindings:&#xA;    bindtestifrunning = libc.nvdaControllertestIfRunning&#xA;    bindspeaktext = libc.nvdaControllerspeakText&#xA;    ...&#xA;&#xA;    def call(self, fn, args, ok=(0,)):&#xA;        retval = fn(*args)&#xA;        if retval not in ok:&#xA;            raise NVDAVMCAPIError(fn.name, retval)&#xA;        return retval&#xA;&#xA;class Nvda(CBindings):&#xA;    @property&#xA;    def isrunning(self):&#xA;        return self.call(self.bindtestifrunning) == 0&#xA;&#xA;    def speak(self, text):&#xA;        self.call(self.bindspeaktext, text)&#xA;    ...&#xA;&#xA;This allowed me to add auditory feedback on both Focus In events and parameter changes caused by user input. 
An example of this, when focusing on a tabgroup:&#xA;&#xA;    case [&#34;CTRL-TAB&#34;] | [&#34;CTRL-SHIFT-TAB&#34;]:&#xA;        self[&#34;tabgroup&#34;].setfocus()&#xA;        self.nvda.speak(f&#34;{values[&#39;tabgroup&#39;]}&#34;)&#xA;&#xA;This was a fairly extensive task since by default NVDA screen reader was unable to recognise any of the elements that PySimpleGUI presents.&#xA;&#xA;hr&#xA;&#xA;This is the first time I&#39;ve attempted to develop an accessible app. I&#39;m pleased with the result and very grateful for the support and feedback I received during development. &#xA;&#xA;Since writing the GUI a few people have reached out to express their gratitude. My aim from the beginning was to help those who find navigating Voicemeeter troublesome, so if this tool assists them then it was all worth the effort.&#xA;&#xA;Further notes:&#xA;&#xA;As of July 2024 PySimpleGUI is no longer open source and requires a license which can be purchased from their site.&#xA;An open source fork has been created which offers all the code up until the last LGPL3 commit, check it out at the FreeSimpleGUI repository&#xA;&#xA;Subscribe to this blog&#39;s RSS feed]]&gt;</description>
      <content:encoded><![CDATA[<p>As mentioned in the previous post <a href="https://blog.onyxandiris.online/voicemeeter-accessibility-for-the-blind">Voicemeeter Accessibility for the Blind</a> I chose to work with <a href="https://github.com/PySimpleGUI">PySimpleGUI</a> when developing the NVDA Voicemeeter application.</p>



<p>For the following reasons:</p>
<ul><li>A chance to work with a new framework.</li>
<li>I&#39;m somewhat familiar with Tkinter (one of the frameworks PySimpleGUI is based on).</li>
<li>Its use of standard Python types to abstract away from geometry managers.</li>
<li>Its use of a messaging system for its event loop.</li>
<li>The speed at which you can turn simple ideas into workable GUIs.</li></ul>

<p>To give a quick example of what I mean I&#39;ll borrow this snippet from the docs.</p>

<pre><code class="language-python">import PySimpleGUI as sg

layout = [
    [sg.Text(&#34;What&#39;s your name?&#34;)],
    [sg.Input(key=&#34;-INPUT-&#34;)],
    [sg.Text(size=(40, 1), key=&#34;-OUTPUT-&#34;)],
    [sg.Button(&#34;Ok&#34;), sg.Button(&#34;Quit&#34;)],
]

window = sg.Window(&#34;Window Title&#34;, layout)

while True:
    event, values = window.read()
    if event == sg.WINDOW_CLOSED or event == &#34;Quit&#34;:
        break

    window[&#34;-OUTPUT-&#34;].update(f&#34;Hello {values[&#39;-INPUT-&#39;]}!&#34;)

window.close()
</code></pre>

<p>Which produces the following window:</p>

<p> <img src="https://img.onyxandiris.online/api/photo/pysimplgui-example_SOlYf6ZA.png?token=wP1doFC4" alt="PySimpleGUI example"></p>

<p>As you can see, the code closely resembles the GUI that it represents. Where you would typically place a widget onto a row or column, with PySimpleGUI you can instead place them into lists, or lists of lists.</p>

<p>In the NVDA Voicemeeter codebase I was able to make this idea scale by creating a Builder class with steps defined as methods and then calling each step in turn. For example, when laying out the Hardware Input buttonmenus I did the following:</p>

<pre><code class="language-python">    def make_tab0_row0(self) -&gt; psg.Frame:
        &#34;&#34;&#34;tab0 row0 represents hardware ins&#34;&#34;&#34;

        def add_physical_device_opts(layout):
            devices = util.get_input_device_list(self.vm)
            devices.append(&#34;- remove device selection -&#34;)
            layout.append(
                [
                    psg.ButtonMenu(
                        f&#34;IN {i + 1}&#34;,
                        size=(6, 3),
                        menu_def=[&#34;&#34;, devices],
                        key=f&#34;HARDWARE IN||{i + 1}&#34;,
                    )
                    for i in range(self.kind.phys_in)
                ]
            )

        hardware_in = []
        [step(hardware_in) for step in (add_physical_device_opts,)]
        return psg.Frame(&#34;Hardware In&#34;, hardware_in)
</code></pre>

<p>Where a list used to represent the layout was passed to a builder method which in turn placed each ButtonMenu element sequentially.</p>

<p>I was then able to make this scale further using the same idea for each tab. Importantly, it gave me the freedom to structure the precise layout of the rows dynamically, according to each kind of Voicemeeter.</p>

<pre><code class="language-python">    layout0 = []
    if self.kind.name == &#34;basic&#34;:
        steps = (
            self.make_tab0_row0,
            self.make_tab0_row1,
            self.make_tab0_row5,
        )
    else:
        steps = (
            self.make_tab0_row0,
            self.make_tab0_row1,
            self.make_tab0_row2,
            self.make_tab0_row3,
            self.make_tab0_row4,
            self.make_tab0_row5,
        )
    for step in steps:
        layout0.append([step()])
</code></pre>

<hr>

<p>Next I&#39;ll talk a bit about the event loop. Unlike other frameworks I&#39;ve worked with, PySimpleGUI events are not based on callbacks but on an event-loop message queue. Specifically, by initiating a while loop and evaluating the result of the <code>read()</code> method on the main window object, we receive event data represented by an event string and a values dictionary. Like so:</p>

<pre><code class="language-python">    while True:
        event, values = self.read()
        self.logger.debug(f&#34;event::{event}&#34;)
        self.logger.debug(f&#34;values::{values}&#34;)
        if event in (psg.WIN_CLOSED, &#34;Exit&#34;):
            break
</code></pre>

<p>This gave me the idea to employ the <a href="https://pypi.org/project/pyparsing/">pyparsing</a> library. It describes itself as <code>an alternative approach to creating and executing simple grammars, vs. the traditional lex/yacc approach, or the use of regular expressions</code>. Since I already had a good idea what the event identifiers would look like (Channel type, index, property type and so on), I figured this was an ideal approach to parsing the event loop. By defining a parser that could split the widget type from the parameter it represents and the event that triggered it, I was able to parse events such as this:</p>

<pre><code class="language-python">    case [[&#34;BUS&#34;, index], [param], [&#34;KEY&#34;, &#34;SPACE&#34; | &#34;ENTER&#34;]]:
        if param == &#34;MODE&#34;:
            util.open_context_menu_for_buttonmenu(self, f&#34;BUS {index}||MODE&#34;)
        else:
            self.find_element_with_focus().click()
</code></pre>

<p>This, for example, allowed me to define the action taken when Space or Enter was pressed on any element representing a Bus class parameter.</p>
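<p>For illustration, a grammar along these lines can split an event identifier into the nested token groups matched above. The <code>||</code> delimiter comes from the app's own keys (such as <code>HARDWARE IN||1</code>), but the exact grammar below is my own sketch, not the app's actual parser:</p>

<pre><code class="language-python">import pyparsing as pp

# A token group is a run of words/digits; "||" separates the widget,
# parameter and event parts of the identifier.
token = pp.Group(pp.OneOrMore(pp.Word(pp.alphanums + "-")))
event_grammar = token + pp.ZeroOrMore(pp.Suppress("||") + token)

parsed = event_grammar.parse_string("BUS 1||MODE||KEY SPACE").as_list()
# parsed == [["BUS", "1"], ["MODE"], ["KEY", "SPACE"]]
</code></pre>

<p>A result shaped like this is exactly what the structural pattern matching above destructures.</p>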

<hr>

<p>All in all I was pleased with my choice to investigate the PySimpleGUI library. It let me spend more time focusing on the functionality and less time thinking about layouts and callbacks.</p>

<p>The only roadblock I came across was the ButtonMenu elements. By default I was unable to open their context menus with the keyboard, only with a mouse. After I reached out to the PSG devs, they informed me that by modifying the underlying Widget object I could make ButtonMenus focusable from the keyboard.</p>

<pre><code class="language-python">    buttonmenu_opts = {&#34;takefocus&#34;: 1, &#34;highlightthickness&#34;: 1}
    for i in range(self.kind.phys_in):
        self[f&#34;HARDWARE IN||{i + 1}&#34;].Widget.config(**buttonmenu_opts)
</code></pre>

<hr>

<p>I will finish off by talking about the NVDA controller client. The API it presents is small, exporting just four functions:</p>

<pre><code class="language-C">/* [comm_status][fault_status] */ error_status_t __stdcall nvdaController_testIfRunning( void);

/* [comm_status][fault_status] */ error_status_t __stdcall nvdaController_speakText( 
    /* [string][in] */ const wchar_t *text);

/* [comm_status][fault_status] */ error_status_t __stdcall nvdaController_cancelSpeech( void);

/* [comm_status][fault_status] */ error_status_t __stdcall nvdaController_brailleMessage( 
    /* [string][in] */ const wchar_t *message);
</code></pre>

<p>The one I was most concerned with was <code>nvdaController_speakText</code>, but since the API was so small I decided to define bindings for all four functions and present them in a wrapper class in Python:</p>

<pre><code class="language-python">class CBindings:
    bind_test_if_running = libc.nvdaController_testIfRunning
    bind_speak_text = libc.nvdaController_speakText
    ...

    def call(self, fn, *args, ok=(0,)):
        retval = fn(*args)
        if retval not in ok:
            raise NVDAVMCAPIError(fn.__name__, retval)
        return retval


class Nvda(CBindings):
    @property
    def is_running(self):
        return self.call(self.bind_test_if_running) == 0

    def speak(self, text):
        self.call(self.bind_speak_text, text)
    ...
</code></pre>
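<p>For context, the <code>libc</code> handle in the snippet above comes from loading the controller client DLL with <code>ctypes</code>. A minimal sketch, assuming the 64-bit build is named <code>nvdaControllerClient64.dll</code> and sits on the DLL search path (both the name and the location are assumptions; pick the build matching your Python's bitness):</p>

<pre><code class="language-python">import ctypes
import sys

if sys.platform == "win32":
    # The functions use the stdcall convention, per the header above,
    # so load the library with WinDLL rather than CDLL.
    libc = ctypes.WinDLL("nvdaControllerClient64.dll")
    # Text arguments are wide strings; error_status_t maps to an unsigned long.
    libc.nvdaController_speakText.argtypes = (ctypes.c_wchar_p,)
    libc.nvdaController_speakText.restype = ctypes.c_ulong
</code></pre>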

<p>This allowed me to add auditory feedback on both Focus In events and parameter changes caused by user input. An example of this, when focusing on a tabgroup:</p>

<pre><code class="language-python">    case [&#34;CTRL-TAB&#34;] | [&#34;CTRL-SHIFT-TAB&#34;]:
        self[&#34;tabgroup&#34;].set_focus()
        self.nvda.speak(f&#34;{values[&#39;tabgroup&#39;]}&#34;)
</code></pre>

<p>This was a fairly extensive task since, by default, the NVDA screen reader was unable to recognise any of the elements that PySimpleGUI presents.</p>

<hr>

<p>This is the first time I&#39;ve attempted to develop an accessible app. I&#39;m pleased with the result and very grateful for the support and feedback I received during development.</p>

<p>Since writing the GUI a few people have reached out to express their gratitude. My aim from the beginning was to help those who find navigating Voicemeeter troublesome, so if this tool assists them then it was all worth the effort.</p>

<p>Further notes:</p>
<ul><li>As of July 2024, PySimpleGUI is no longer open source and requires a license, which can be purchased from their site.</li>
<li>An open source fork has been created which offers all the code up until the last LGPL3 commit; check it out at the <a href="https://github.com/spyoungtech/FreeSimpleGUI">FreeSimpleGUI repository</a>.</li></ul>

<p>Subscribe to this blog&#39;s <a href="https://blog.onyxandiris.online/feed/">RSS feed</a></p>
]]></content:encoded>
      <guid>https://blog.onyxandiris.online/pysimplegui-and-nvda-voicemeeter</guid>
      <pubDate>Tue, 24 Oct 2023 13:57:59 +0000</pubDate>
    </item>
    <item>
      <title>Voicemeeter Accessibility for the Blind</title>
      <link>https://blog.onyxandiris.online/voicemeeter-accessibility-for-the-blind</link>
      <description>&lt;![CDATA[Voicemeeter is an excellent application for those who can navigate it with ease; however, for those with accessibility needs it can be a very frustrating experience. I know this for sure, having spoken to a number of visually impaired users.&#xA;&#xA;!--more--&#xA;&#xA;After being approached by Mario Loreti, an Italian voice talent who uses Voicemeeter in his professional work, I decided to take on the task of developing an accessible app that would work with a screen reader.&#xA;&#xA;---&#xA;&#xA;Step one, pick the screen reader. I chose NVDA since it&#39;s open source, offers an extensive API, an add-on ecosystem and even a Controller Client exposing functions through a standard Windows DLL.&#xA;&#xA;Most NVDA add-ons take one of two forms:&#xA;&#xA;an extension to an existing GUI that reads events emitted by user controls&#xA;a standalone application that hooks into the Controller client&#xA;&#xA;Voicemeeter was written in C using the Win32 API; therefore, it does not emit the kind of events required by the NVDA API. For this reason my only option was to develop a standalone application.&#xA;&#xA;---&#xA;&#xA;Step two, choose the language and framework. I strongly considered C# and WPF, but given I had a lot of new accessibility topics to learn and had already written a Voicemeeter GUI in Python, I decided to stick with the familiar. That said, I didn&#39;t want to simply write another GUI in Tkinter, so I chose to investigate PySimpleGUI. It&#39;s essentially a wrapper around multiple frameworks, but it offers some interesting ideas which I&#39;ll go into in the follow-up post.&#xA;&#xA;---&#xA;&#xA;Next I had to decide on the layout of the application. As outlined in the software specification, Voicemeeter comes in three versions: Basic, Banana and Potato. Each scales differently and some controls exist only for certain versions. 
All of this had to be considered when developing the NVDA Voicemeeter application.&#xA;&#xA;I knew beforehand there were two particular areas of difficulty for visually impaired users: the settings menu and the GUI sliders. So I started with the settings tab:&#xA;&#xA;Settings tab&#xA;&#xA;To conform with the software specification, all elements of the GUI are standard Windows controls. Each of the Hardware In/Out buttons offers context menus, Patch ASIO Inputs to Strips use spinboxes while Patch Inserts use checkboxes. &#xA;&#xA;---&#xA;&#xA;Next I decided to lay out the channel buttons:&#xA;&#xA;Channel buttons&#xA;&#xA;As you can see in the Potato version there are a lot of buttons, so after some discussion with Mario I decided to split the buttons from the sliders using nested tabs. &#xA;&#xA;Here are the sliders:&#xA;&#xA;Channel sliders&#xA;&#xA;---&#xA;&#xA;Finally I added a menu element.&#xA;&#xA;menu&#xA;&#xA;It&#39;s simple but important, giving users the option to save current settings (as a standard Voicemeeter XML profile), load previous settings and set a profile to load automatically on launch.&#xA;&#xA;---&#xA;&#xA;I will write a follow-up post going into more detail about the code and the PySimpleGUI library.&#xA;&#xA;You may check out the source code along with usage instructions.&#xA;&#xA;For a user-friendly version of the README, check the NVDA Voicemeeter page on our site. You will also find download links for pre-built releases.&#xA;&#xA;Subscribe to this blog&#39;s RSS feed]]&gt;</description>
      <content:encoded><![CDATA[<p><a href="https://voicemeeter.com/">Voicemeeter</a> is an excellent application for those who can navigate it with ease; however, for those with accessibility needs it can be a very frustrating experience. I know this for sure, having spoken to a number of visually impaired users.</p>



<p>After being approached by <a href="https://www.marioloreti.net/en/">Mario Loreti</a>, an Italian voice talent who uses Voicemeeter in his professional work, I decided to take on the task of developing an accessible app that would work with a screen reader.</p>

<hr>

<p>Step one, pick the screen reader. I chose <a href="https://www.nvaccess.org/about-nvda/">NVDA</a> since it&#39;s open source, offers an extensive API, an add-on ecosystem and even a <a href="https://github.com/nvaccess/nvda/tree/master/extras/controllerClient">Controller Client</a> exposing functions through a standard Windows DLL.</p>

<p>Most NVDA add-ons take one of two forms:</p>
<ul><li>an extension to an existing GUI that reads events emitted by user controls</li>
<li>a standalone application that hooks into the Controller client</li></ul>

<p>Voicemeeter was written in C using the Win32 API; therefore, it does not emit the kind of events required by the NVDA API. For this reason my only option was to develop a standalone application.</p>

<hr>

<p>Step two, choose the language and framework. I strongly considered C# and WPF, but given I had a lot of new accessibility topics to learn and had already written a Voicemeeter GUI in Python, I decided to stick with the familiar. That said, I didn&#39;t want to simply write another GUI in Tkinter, so I chose to investigate <a href="https://github.com/PySimpleGUI/PySimpleGUI">PySimpleGUI</a>. It&#39;s essentially a wrapper around multiple frameworks, but it offers some interesting ideas which I&#39;ll go into in the follow-up post.</p>

<hr>

<p>Next I had to decide on the layout of the application. As outlined in the <a href="https://git.onyxandiris.online/onyx_online/nvda-voicemeeter/src/branch/dev/SPECIFICATION.md">software specification</a>, Voicemeeter comes in three versions: Basic, Banana and Potato. Each scales differently and some controls exist only for certain versions. All of this had to be considered when developing the NVDA Voicemeeter application.</p>

<p>I knew beforehand there were two particular areas of difficulty for visually impaired users: the settings menu and the GUI sliders. So I started with the settings tab:</p>

<p><img src="https://img.onyxandiris.online/api/photo/settings_CRhnmQI3.png?token=vHlcaNmd" alt="Settings tab"></p>

<p>To conform with the software specification, all elements of the GUI are standard Windows controls. Each of the Hardware In/Out buttons offers context menus, Patch ASIO Inputs to Strips use spinboxes while Patch Inserts use checkboxes.</p>

<hr>

<p>Next I decided to lay out the channel buttons:</p>

<p><img src="https://img.onyxandiris.online/api/photo/channel-buttons_OZf7tctq.png?token=47WBhQkV" alt="Channel buttons"></p>

<p>As you can see in the Potato version there are a lot of buttons, so after some discussion with Mario I decided to split the buttons from the sliders using nested tabs.</p>

<p>Here are the sliders:</p>

<p><img src="https://img.onyxandiris.online/api/photo/sliders_pB1tfm2q.png?token=b2hEpeO2" alt="Channel sliders"></p>

<hr>

<p>Finally I added a menu element.</p>

<p><img src="https://img.onyxandiris.online/api/photo/menu-element_YmbGwnLS.png?token=CsVppOqo" alt="menu"></p>

<p>It&#39;s simple but important, giving users the option to save current settings (as a standard Voicemeeter XML profile), load previous settings and set a profile to load automatically on launch.</p>

<hr>

<p>I will write a follow-up post going into more detail about the code and the PySimpleGUI library.</p>

<p>You may check out the <a href="https://git.onyxandiris.online/onyx_online/nvda-voicemeeter">source code</a> along with usage instructions.</p>

<p>For a user-friendly version of the README, check the <a href="https://onyxandiris.online/nvda-voicemeeter">NVDA Voicemeeter</a> page on our site. You will also find download links for pre-built releases.</p>

<p>Subscribe to this blog&#39;s <a href="https://blog.onyxandiris.online/feed/">RSS feed</a></p>
]]></content:encoded>
      <guid>https://blog.onyxandiris.online/voicemeeter-accessibility-for-the-blind</guid>
      <pubDate>Wed, 18 Oct 2023 14:22:18 +0000</pubDate>
    </item>
  </channel>
</rss>