<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:media="http://search.yahoo.com/mrss/"><channel><title><![CDATA[dan_puzey]]></title><description><![CDATA[Random musings of an eclectic geek.]]></description><link>https://blog.puzey.net/</link><image><url>https://blog.puzey.net/favicon.png</url><title>dan_puzey</title><link>https://blog.puzey.net/</link></image><generator>Ghost 4.48</generator><lastBuildDate>Sat, 04 Apr 2026 09:10:01 GMT</lastBuildDate><atom:link href="https://blog.puzey.net/rss/" rel="self" type="application/rss+xml"/><ttl>60</ttl><item><title><![CDATA[Logitech GHub: fixing my G903]]></title><description><![CDATA[Getting the G903 "button on the bottom" working using GHub.]]></description><link>https://blog.puzey.net/logitech-ghub-bind-all-the-things/</link><guid isPermaLink="false">606f3b55506b17000137118d</guid><category><![CDATA[hardware]]></category><dc:creator><![CDATA[Dan Puzey]]></dc:creator><pubDate>Sun, 30 Aug 2020 12:04:11 GMT</pubDate><media:content url="https://blog.puzey.net/content/images/2020/08/20200830_125205.jpg" medium="image"/><content:encoded><![CDATA[<img src="https://blog.puzey.net/content/images/2020/08/20200830_125205.jpg" alt="Logitech GHub: fixing my G903"><p>It started with an annoying day for Logitech updates:</p><figure class="kg-card kg-embed-card"><blockquote class="twitter-tweet"><p lang="en" dir="ltr">Blurgh, <a href="https://twitter.com/LogitechG?ref_src=twsrc%5Etfw">@LogitechG</a> strikes again and I&apos;ve lost my mouse profiles.&#x1F62D;<br><br>If anyone needs me, I&apos;ll just be reinstalling a 2-year-old LGS installer because that _actually works_. 
<a href="https://twitter.com/hashtag/sigh?src=hash&amp;ref_src=twsrc%5Etfw">#sigh</a></p>&#x2014; Dan Puzey (@DanPuzey) <a href="https://twitter.com/DanPuzey/status/1300021088475320320?ref_src=twsrc%5Etfw">August 30, 2020</a></blockquote>
<script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script>
</figure><p>I installed GHub a while back because it was the new upgrade from Logitech Gaming Software, but found that it just <em>didn&apos;t work</em> as well as the old software. My main bugbear was that the button on the bottom of the mouse, which typically switches between profiles, wasn&apos;t bindable in the app, despite what their manual says:</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://blog.puzey.net/content/images/2020/08/image-2.png" class="kg-image" alt="Logitech GHub: fixing my G903" loading="lazy"><figcaption>These arrow buttons (that flip the mouse so you can bind the bottom buttons) don&apos;t appear for my G903</figcaption></figure><p>This meant I could only switch between my &quot;desktop&quot; and &quot;gaming&quot; profiles by opening the app and clicking around, which is too slow for something done so regularly. So, I&apos;d ended up sticking with the onboard profiles, retained on the device from the previous software. Not ideal, but functional, and the profile-switch button worked.</p><p>Then this morning came a firmware update and, when I installed it, the onboard profiles were lost. My mouse was back to factory settings, and I had no way to reinstate the bindings so my button worked! I was about to reinstall LGS, but decided to have a poke first.</p><p>The <a href="https://docs.microsoft.com/en-us/sysinternals/">SysInternals</a> ProcMon tool helped me find the file that contains my profile. It&apos;s only written to when the app window closes, and it&apos;s written to by a different process (<code>lghub_agent</code>), but I found it in the end! 
It&apos;s a <code>json</code> file, and it lives here: <code>C:\Users\&lt;user&gt;\AppData\Local\LGHUB\settings.json</code></p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://blog.puzey.net/content/images/2020/08/image-3.png" class="kg-image" alt="Logitech GHub: fixing my G903" loading="lazy"><figcaption>Hurrah for ProcMon!</figcaption></figure><p>Inside that file, all the button bindings are clearly visible in an array under <code>profiles[x].assignments</code>, and have a consistent form: </p><pre><code class="language-json">{
    &quot;cardId&quot;: &quot;0f82f693-5b78-4cf5-867e-080100000000&quot;,
    &quot;slotId&quot;: &quot;g900_g4_m1&quot;
}</code></pre><p><code>slotId</code> refers to the physical button, but what&apos;s <code>cardId</code>? I could find no reference to map this value to a function. Then, back in the app, I noticed this:</p><figure class="kg-card kg-image-card"><img src="https://blog.puzey.net/content/images/2020/08/image-1.png" class="kg-image" alt="Logitech GHub: fixing my G903" loading="lazy"></figure><p>In onboard mode, pressing the button on the base of the mouse still switches profiles. Could this be why?</p><p>I pasted <code>settings.json</code> into one side of a <a href="https://www.scootersoftware.com/">BeyondCompare</a> text compare, then bound one of the available buttons (G4) to &quot;Onboard Profile Cycle&quot;, and compared the updated <code>settings.json</code>. The <code>g900_g4_m1</code> entry was updated in the file, and it was the same <code>cardId</code> as the <code>g900_g12</code> button entries. Looked like I was in business: the bottom button is &quot;G12&quot; and the bindings are in the config file!</p><p>From here it&apos;s pretty straightforward: I bound the thumb button to the &quot;G Hub Profile Cycle&quot; and noted the new <code>cardId</code> from the updated settings file: <code>0f82f693-5b78-4cf5-867e-080100000000</code>. This is the one I needed!</p><p>I shut down GHub completely so I could hand-edit the settings, and replaced all <code>g900_g12</code> button bindings with the new guid (there&apos;s a prompt in the GHub app that suggests the button has to be bound in <em>all </em>profiles for some reason). Restarting the software, I found that, finally, my underside button switches profiles on my mouse again!</p><p>... and it only took an hour to work it all out. If this comes up again, or for anyone else, hopefully this post will save some time!</p><p><strong>UPDATE</strong>: Logitech replied to my tweet, and apparently this isn&apos;t <em>supposed</em> to work. 
It seems they&apos;re as confused as anyone...</p><figure class="kg-card kg-embed-card"><blockquote class="twitter-tweet"><p lang="en" dir="ltr">Hi Dan. Thanks for getting in touch but yes this is known, the G903 doesn&apos;t have one. It has the power switch and on-board profile cycle button at the bottom. Hope this helps.</p>&#x2014; Logitech G UK (@LogitechGUK) <a href="https://twitter.com/LogitechGUK/status/1301461960148516865?ref_src=twsrc%5Etfw">September 3, 2020</a></blockquote>
<script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script>
</figure>]]></content:encoded></item><item><title><![CDATA[Self-hosting Ghost on Azure]]></title><description><![CDATA[How to (relatively) simply spin up a Ghost blog using Azure Container Instances and Storage Accounts.]]></description><link>https://blog.puzey.net/self-hosting-ghost-on-azure/</link><guid isPermaLink="false">606f3b55506b17000137118c</guid><category><![CDATA[azure]]></category><category><![CDATA[dev]]></category><dc:creator><![CDATA[Dan Puzey]]></dc:creator><pubDate>Sun, 17 Nov 2019 18:05:00 GMT</pubDate><content:encoded><![CDATA[<p>Since I first discovered it, I&apos;ve been running my blog on <a href="https://ghost.org">Ghost</a>. But, being the cheapskate/curious developer I am, I&apos;ve been self-hosting.</p><p>Ghost was never really set up to run on Azure though, and there&apos;s always been a number of hoops to jump through to make that work. My original solution was a fork of <a href="https://github.com/TryGhost/">Ghost&apos;s code</a> and <a href="https://github.com/TryGhost/Casper">the default theme</a> with the tweaks I needed to get it running and add <a href="https://disqus.com/">Disqus</a> and such to the theme. I had this set up to auto-deploy from Git to an Azure webapp... but of course, over time, I fell behind updating from the upstream repo and my blog grew dusty.</p><p>What I wanted instead was a more lightweight approach: I didn&apos;t want the hassle of repos or manual code updates. A containerised solution made sense, but I didn&apos;t want to have to deal with managing docker hosting infrastructure either. What would have been ideal was a simple container deployment, with storage mapped out to the cloud so the data is properly persistent.</p><h2 id="say-hello-to-container-instances">Say hello to Container Instances</h2><p><a href="https://azure.microsoft.com/en-gb/services/container-instances/">Container instances</a> are a relatively recent addition to Azure&apos;s service. 
They allow for lightweight containerised deployments with the bare minimum of configuration. In my case, I could deploy the Docker containers necessary for my blog hosting with just a couple of files and minimal configuration: all of the deployed containers are straight from the public registry, and configured through mapped volumes and environment variables. Future updates will be straightforward since I can simply redeploy the configuration to pick up the latest, updated images.</p><p>The deployment can be specified as JSON or YAML - I used the latter as it was more lightweight. I also needed a simple Nginx config, and a few Azure file shares for the data persistence.</p><h2 id="annoyances-first">Annoyances first</h2><p>So, bad news up front. The first big issue I ran into was that, for some unknown reason, the Ghost container doesn&apos;t like writing its database to an Azure file share. It can create an empty database, but the migrations always throw a <code>MigrationsAreLockedError</code> and I couldn&apos;t get around that (either with MySql <em>or</em> SQLite). I dug around this for a while to no avail, and eventually settled on running a dedicated MySql container, which worked fine with the mapped file share.</p><p>The other problems I ran into all have one root cause: Azure file shares don&apos;t support links. Ghost&apos;s theme handling relies on creating links to function (at least in the Docker container). The same goes for Certbot, but more on that later. 
The solution to these is, unfortunately, to leave some non-critical data (like my blog&apos;s theme) in volatile storage on the container.</p><h2 id="prerequisites">Prerequisites</h2><p>The first step is to create a storage account in Azure and add some file shares to it:</p><ul><li>one for the nginx config</li><li>one for the MySql database</li><li>one for the Ghost image files</li></ul><p>You&apos;ll need the account name and key for accessing the shares; you can find those under &quot;Access keys&quot; in the storage account&apos;s settings in the Azure portal. You&apos;ll also need a way of pushing files to the shares - either the <a href="https://azure.microsoft.com/en-gb/features/storage-explorer/">Azure Storage Explorer</a> or the <a href="https://docs.microsoft.com/en-us/azure/storage/files/storage-how-to-use-files-cli">Azure CLI</a>.</p><p>Nginx is going to need a config file for the site. A simple one will forward all of the public HTTP traffic to the Ghost container&apos;s port, and looks like this:</p><figure class="kg-card kg-code-card"><pre><code>server {
  listen 80 default_server;
  listen [::]:80 default_server;
  server_name puzey.net *.puzey.net;

  location / {
    proxy_pass http://127.0.0.1:2368;
    proxy_set_header    Host                $http_host;
    proxy_set_header    X-Real-IP           $remote_addr;
    proxy_set_header    X-Forwarded-For     $proxy_add_x_forwarded_for;
  }
}</code></pre><figcaption>A simple nginx .conf file for this site</figcaption></figure><p>You can see that the Ghost container will be addressed as <code>localhost</code> - as are all containers in the group - but on its own mapped port, which is not available publicly. This file should go in the nginx file share.</p><h2 id="the-deployment-file">The deployment file</h2><p>Now, we need the container instance YAML file. I&apos;ll go through this in stages - but everything in this section should be in a single file.</p><p>First, some boilerplate:</p><pre><code class="language-YAML">apiVersion: &apos;2018-10-01&apos;
tags: null
type: Microsoft.ContainerInstance/containerGroups
location: uksouth
name: your-container-name</code></pre><p>You can set the <code>tags</code>, <code>location</code> (i.e. the Azure region) and <code>name</code> to whatever suits you.</p><p>Next, we&apos;ll define our Azure file shares as volumes, and configure the public network profile:</p><pre><code class="language-YAML">properties:
  osType: Linux
  ipAddress:
    dnsNameLabel: someDnsName
    type: Public
    ports:
    - protocol: tcp
      port: 80
  volumes:
    - name: nginxconfig-volume
      azureFile:
        shareName: nginx-share
        storageAccountName: myAccountName
        storageAccountKey: myAccountKey
    - name: ghostdata-volume
      azureFile:
        shareName: mysql-data-share
        storageAccountName: myAccountName
        storageAccountKey: myAccountKey
    - name: ghostimages-volume
      azureFile:
        shareName: ghost-images-share
        storageAccountName: myAccountName
        storageAccountKey: myAccountKey</code></pre><p>Setting the <code>dnsNameLabel</code> makes your site available at a URL of the form <code>someDnsName.uksouth.azurecontainer.io</code>, which can be useful since subsequent redeployments might not reuse the same IP address. The details of the three volumes should match those of the storage account.</p><p>Now, we can define the containers. These are nested within the <code>properties</code> key, so be careful with the indenting if you&apos;re copying directly from the post. The first container will be the MySql database:</p><pre><code class="language-YAML">  containers:
  - name: mysql
    properties:
      environmentVariables:
      - name: MYSQL_ROOT_PASSWORD
        value: this_is_required_but_will_be_overwritten
      - name: MYSQL_RANDOM_ROOT_PASSWORD
        value: &quot;yes&quot;
      - name: MYSQL_USER
        value: some_other_user
      - name: MYSQL_PASSWORD
        value: some_other_password
      - name: MYSQL_DATABASE
        value: some_database_name
      image: docker.io/mysql:5.7
      ports:
      - port: 3306
        protocol: tcp
      resources:
        requests:
          cpu: 0.5
          memoryInGb: 0.5
      volumeMounts:
      - mountPath: /var/lib/mysql
        name: ghostdata-volume</code></pre><p>Note that listing port 3306 in the definition maps that port to <code>localhost</code> for other containers in this file, without exposing it publicly. (Only the ports specified earlier are public). The Azure file share is mounted here so that the database writes into persistent storage, instead of keeping all our data within the container. Specifying the <code>MYSQL_DATABASE</code> environment variable means that the specified database will be created, with the defined user as admin.</p><p>We need Ghost itself, configured to use the MySql database:</p><pre><code class="language-YAML">  - name: ghost
    properties:
      environmentVariables:
      - name: url
        value: http://your_url_here
      - name: database__client
        value: mysql
      - name: database__connection__host
        value: localhost
      - name: database__connection__port
        value: 3306
      - name: database__connection__user
        value: your_db_user
      - name: database__connection__password
        value: your_db_password
      - name: database__connection__database
        value: your_db
      image: docker.io/ghost:3.0-alpine
      ports:
      - port: 2368
        protocol: tcp
      resources:
        requests:
          cpu: 0.5
          memoryInGb: 0.5
      volumeMounts:
      - mountPath: /var/lib/ghost/content/images
        name: ghostimages-volume</code></pre><p>There are a few things to note here.</p><ul><li>If you make the URL <code>https:</code> here (regardless of whether Nginx is set up to serve it), the Ghost container seems to get in a tizzy because it can&apos;t serve over SSL itself, and nothing works. The solution to this looks to be using Nginx to redirect, but this would mean every link on the blog is <code>http:</code> and then returns a redirect, which is horrible. That&apos;s not the only problem with SSL anyway, so it&apos;s moot (see later!).</li><li>The database user obviously should match whatever is configured for the MySql container. </li><li>The Azure file share is mounted so that any images uploaded to the blog are written to persistent storage outside of the container.</li></ul><p>Finally, we need an Nginx instance to serve everything:</p><pre><code class="language-YAML">  - name: nginx
    properties:
      command:
      - /bin/sh
      - -c
      - while :; do sleep 6h &amp; wait $$!; echo Reloading nginx config; nginx -s reload; done &amp; echo nginx starting; nginx -g &quot;daemon off;&quot;
      image: docker.io/nginx:mainline-alpine
      resources:
        requests:
          cpu: 0.5
          memoryInGb: 0.5
      ports:
      - port: 80
        protocol: tcp
      volumeMounts:
      - mountPath: /etc/nginx/conf.d/
        name: nginxconfig-volume</code></pre><p>Here, we map in the Azure file share that contains the config file. This means Nginx will actually serve the file. The port 80 exposed here is also exposed publicly, so the outside world can request the site over HTTP and Nginx will handle it.</p><p>The fiddly-looking <code>command</code> property means that Nginx will reload its configuration every 6 hours. That means that if the site configuration needs updating at any point, that can be done by simply uploading a new <code>.conf</code> file to the Azure share, and Nginx will pick it up at the next reload - rather than having to restart the container.</p><h3 id="a-note-on-resources">A note on resources</h3><p>Each container has to specify a CPU and memory resource request. The <em>sum</em> of all these requests is what will be allocated to the container group as a whole, but each container can grow beyond its individual requested amount. So, with the definitions above, Azure will allocate 1.5 CPUs and 1.5Gb of memory to the cluster, and if Nginx and Ghost only use 0.1Gb each, MySql can fill the remaining 1.3Gb.</p><p>These resource requests will also be the major factor in the resource charge, so it&apos;s worth bearing this in mind!</p><h2 id="deploying">Deploying</h2><p>Deploying the whole thing using the Azure CLI is a doddle:</p><pre><code>az container create --file .\my-def.yaml --resource-group some_group</code></pre><p>Make sure that the resource group name is one that already exists (or create one before running the above).</p><p>Initially on deployment you&apos;ll likely see that the Ghost container falls over and is restarted once or twice, while the MySql server spins up, but you should then see it running cleanly. 
The Azure portal lets you view the status and logs of each container in the group and (where possible) you can also launch a shell in the container to poke about if there are any issues.</p><h2 id="what-about-https">What about HTTPS?</h2><p>My earlier hosting was a standard Azure WebApp, and I had set up the <a href="https://github.com/sjkp/letsencrypt-siteextension">Let&apos;s Encrypt Azure Extension</a> in order to handle SSL. This was working nicely (and I still use it for the other sites that I host)... but this isn&apos;t an option with Container Instances.</p><p>There&apos;s an obvious solution though: Let&apos;s Encrypt provide <a href="https://certbot.eff.org/">CertBot</a>, which is a nice black-box dollop of cleverness that takes care of it all for you. There&apos;s even an official <a href="https://hub.docker.com/r/certbot/certbot/">Docker image</a> for it that should help in situations like this.</p><p>So, here&apos;s the theory: the CertBot container calls Let&apos;s Encrypt and requests certificates for a configured list of domains. In order to authenticate, it has to serve back some challenge files from the domain in question, which LE will request over HTTP. (The alternative is DNS validation, but this would require me to hook things up outside of my container deployment, and I&apos;m trying to avoid that.)</p><p>This is accomplished via another couple of file shares, mapped into both the CertBot container and the Nginx container. This allows the challenge files and certificates that CertBot uses to be served by Nginx, requiring only a simple tweak to the <code>.conf</code> file and the deployment YAML.</p><p>The problem is that once Certbot <em>has</em> the certificates, it uses a symlink to point at the latest requested cert. (This lets it keep a history of issued certs, but have the &quot;live&quot; one always at the same place.) Since the directory in question is mapped to the Azure file share, the link fails. 
I can <em>get</em> the certificates for my domain, I just can&apos;t <em>serve</em> them automatically.</p><p>I&apos;ll be looking into this again at some point in the future, but for now I&apos;d rather my blog was live and updated <em>without</em> HTTPS than languishing further while I figure it out.</p><p>Thoughts and feedback welcome!</p>]]></content:encoded></item><item><title><![CDATA[Why was Unity3d taking two hours to bake occlusion data?]]></title><description><![CDATA[<!--kg-card-begin: markdown--><p>This is the problem I was faced with when building <a href="http://thesignalfrom.com">The Signal From Tolva</a>. The builds would take hours and it was the OC bake causing issues.</p>
<p>I should pause to explain what I was doing in full. I value automation: it reduces risk, reduces effort, and makes it simpler</p>]]></description><link>https://blog.puzey.net/why-does-unity3d-take-two-hours-to-bake-occlusion-data/</link><guid isPermaLink="false">606f3b55506b17000137118b</guid><category><![CDATA[build]]></category><category><![CDATA[unity3d]]></category><dc:creator><![CDATA[Dan Puzey]]></dc:creator><pubDate>Wed, 26 Apr 2017 22:08:24 GMT</pubDate><content:encoded><![CDATA[<!--kg-card-begin: markdown--><p>This is the problem I was faced with when building <a href="http://thesignalfrom.com">The Signal From Tolva</a>. The builds would take hours and it was the OC bake causing issues.</p>
<p>I should pause to explain what I was doing in full. I value automation: it reduces risk, reduces effort, and makes it simpler for anyone to run a given process if the need arises. (In practice I run all of the builds, but the scripts I created mean that anyone <em>could</em> run a build if needs be. Always reduce the <a href="https://en.wikipedia.org/wiki/Bus_factor">bus factor</a>.)</p>
<p>A good build process should be consistent, fast, and assume as little as possible. In practice, the latter means that for things like occlusion culling (where the data you build with should be based on the latest state of the game), you want a fresh bake to run as part of the build process. If you don&apos;t do that, you risk building with out-of-date occlusion data (which could lead to objects popping in and out of visibility for no apparent reason) or with no occlusion culling at all. Neither of these are good news, so the build bakes every time.</p>
<p>The problem was that, after a couple of builds, the process slowed to a crawl. Unity would hang for 2+ hours, and the IDE log was always in the same place: at the end of the occlusion bake, after the line <code>INFO: Checking cache...</code> was written. Nothing else was output during the hang and, after the delay, the build would complete as normal.</p>
<p>Even asking a Unity dev about this yielded no results: it&apos;s middleware, and apparently nobody knew. So I delved deeper with <a href="https://technet.microsoft.com/en-us/sysinternals/processmonitor.aspx">SysInternals Procmon</a> and worked out that the OC bake builds a cache of files every time it runs.</p>
<p>This cache sits under <code>Library/Occlusion</code> in the project folder, and (on my machine after a handful of builds) had <em>hundreds of thousands</em> of files in it. Presumably these were stale cache data from previous bakes; it&apos;s not clear what the process is doing checking a cache <em>after</em> it completes, but I had a hunch that scanning 500k files wasn&apos;t going to be a fast process.</p>
<p>The solution from this point was as obvious as you&apos;d expect: the build script now deletes that folder before it runs the OC bake (the first time it ran, it took half an hour just to delete all those files!). Our automated build runs in a few minutes, as it should. And I get two hours&apos; extra work done on build days!</p>
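<p>As a rough sketch of that step (this isn&apos;t our actual build script - the class and method names here are invented for illustration; only the folder path and the idea of deleting it before the bake come from the investigation above), an editor-side build helper might look like:</p>
<pre><code class="language-csharp">using System.IO;
using UnityEditor;

public static class BuildSteps
{
    public static void BakeOcclusion()
    {
        // Delete the stale occlusion cache so the &quot;Checking cache...&quot;
        // step has nothing to crawl through.
        var cachePath = Path.Combine(&quot;Library&quot;, &quot;Occlusion&quot;);
        if (Directory.Exists(cachePath))
        {
            Directory.Delete(cachePath, true); // recursive delete
        }

        // Run a fresh bake for the open scene (blocks until complete).
        StaticOcclusionCulling.Compute();
    }
}
</code></pre>
<p>A step like this slots into whatever batch-mode build entry point you already have, before the player build itself runs.</p>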
<!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[Global Game Jam 2016]]></title><description><![CDATA[<!--kg-card-begin: markdown--><p>It seems I just took part in my first ever game jam and, to my surprise, I made something actually playable.</p>
<p><img src="https://pbs.twimg.com/tweet_video_thumb/CZ_PuExWYAEmf-5.png" alt="Streets Of Mage WIP" loading="lazy"></p>
<p>Thanks in huge part to the artistic and audio wizardry of <a href="https://twitter.com/john_arr">John Roberts</a> and <a href="https://twitter.com/nedymond">Nick Dymond</a>, the screenie above was actually only the halfway point and totally fails to</p>]]></description><link>https://blog.puzey.net/global-game-jam-2016/</link><guid isPermaLink="false">606f3b55506b170001371187</guid><category><![CDATA[unity3d]]></category><category><![CDATA[game-development]]></category><dc:creator><![CDATA[Dan Puzey]]></dc:creator><pubDate>Tue, 02 Feb 2016 00:29:18 GMT</pubDate><content:encoded><![CDATA[<!--kg-card-begin: markdown--><p>It seems I just took part in my first ever game jam and, to my surprise, I made something actually playable.</p>
<p><img src="https://pbs.twimg.com/tweet_video_thumb/CZ_PuExWYAEmf-5.png" alt="Streets Of Mage WIP" loading="lazy"></p>
<p>Thanks in huge part to the artistic and audio wizardry of <a href="https://twitter.com/john_arr">John Roberts</a> and <a href="https://twitter.com/nedymond">Nick Dymond</a>, the screenie above was actually only the halfway point and totally fails to capture the madness of what was, in the end, a &quot;who-can-do-random-combos-fastest&quot; game.</p>
<p>We started off with loftier ambitions that included <a href="https://twitter.com/John_Arr/status/693572641014337537">cats</a>, but for once common sense prevailed over scope-creep and we managed to polish the core idea to something that has kept a few people playing for the last couple of days.</p>
<p>Being the <em>*ahem*</em> professional that I am, I pushed the game code and assets through Git, and you can check out the game itself, all of the code and assets, and the occasionally spurious commit history in the repo <a href="https://github.com/DanPuzey/StreetsOfMage">here</a>. It&apos;s a game written in Unity3d (v5.3 for those interested); it&apos;s local 2-player only, and requires two Xbox controllers to play.</p>
<p>(If someone really wants to patch it for cross-platform or controller capability then please be my guest! ;-))</p>
<p>All in all, this was a fantastic weekend, and I look forward to my second jam with some anticipation!</p>
<!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[Handy Unity fact of the day]]></title><description><![CDATA[<!--kg-card-begin: markdown--><p>So, today&apos;s useful snippet, after too long debugging a related issue: if you have an <code>Awake()</code> method on a <code>MonoBehaviour</code> in Unity, and that method throws an exception that isn&apos;t handled, the component in question will be automatically disabled on startup.</p>
<p>It turns out that tracking</p>]]></description><link>https://blog.puzey.net/handy-unity-fact-of-the-day/</link><guid isPermaLink="false">606f3b55506b170001371183</guid><category><![CDATA[unity3d]]></category><category><![CDATA[game-development]]></category><category><![CDATA[tips]]></category><dc:creator><![CDATA[Dan Puzey]]></dc:creator><pubDate>Mon, 13 Apr 2015 20:56:18 GMT</pubDate><content:encoded><![CDATA[<!--kg-card-begin: markdown--><p>So, today&apos;s useful snippet, after too long debugging a related issue: if you have an <code>Awake()</code> method on a <code>MonoBehaviour</code> in Unity, and that method throws an exception that isn&apos;t handled, the component in question will be automatically disabled on startup.</p>
<p>It turns out that tracking down <em>why</em> something has been disabled or enabled in Unity is quite tricky - even if you add the <code>OnDisable()</code> method to the component, there&apos;s no callstack or passed-in information that lets you work out what happened.</p>
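<p>One cheap way to make the failure visible - purely a sketch, not something from the original debugging session, and the <code>Initialise()</code> helper is invented for illustration - is to catch the exception yourself, log it with the component as context, and rethrow:</p>
<pre><code class="language-csharp">using UnityEngine;

public class MyComponent : MonoBehaviour
{
    void Awake()
    {
        try
        {
            Initialise(); // whatever setup work might throw
        }
        catch (System.Exception ex)
        {
            // Passing &apos;this&apos; as context means clicking the console
            // entry highlights the object whose component got disabled.
            Debug.LogError(&quot;Awake failed on &quot; + name + &quot;: &quot; + ex, this);
            throw;
        }
    }

    void Initialise() { /* hypothetical setup code */ }
}
</code></pre>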
<!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[Unity's .SetActive vs .SetActiveRecursively]]></title><description><![CDATA[<!--kg-card-begin: markdown--><p>A story of how fixing warnings broke all my code. It&apos;s all very well obsoleting this stuff, but you have to make sure it&apos;s still compatible...</p>
<p>At some point during the 4.x cycle, <a href="http://unity3d.com">Unity</a> switched the way that a <code>GameObject</code> is activated and deactivated in</p>]]></description><link>https://blog.puzey.net/unitys-setactive-vs-setactiverecursively/</link><guid isPermaLink="false">606f3b55506b170001371182</guid><category><![CDATA[unity3d]]></category><dc:creator><![CDATA[Dan Puzey]]></dc:creator><pubDate>Thu, 09 Apr 2015 12:09:31 GMT</pubDate><content:encoded><![CDATA[<!--kg-card-begin: markdown--><p>A story of how fixing warnings broke all my code. It&apos;s all very well obsoleting this stuff, but you have to make sure it&apos;s still compatible...</p>
<p>At some point during the 4.x cycle, <a href="http://unity3d.com">Unity</a> switched the way that a <code>GameObject</code> is activated and deactivated in a scene at runtime. Previously, there was a boolean <code>.active</code> property, and a method called <code>SetActiveRecursively()</code> that could be used. This leads to some confusion: if I set <code>a.active = false</code> then what is the state of a child of <code>a</code>? Assuming that all children of <code>a</code> are automatically disabled too, what&apos;s the purpose of <code>SetActiveRecursively()</code>?</p>
<p>The recent change obsoletes the original members and adds a single place to change state: the <code>SetActive()</code> method. This new method sets the state only on the current GameObject but, as expected, all descendant objects are automatically disabled. Two readonly properties can be queried to examine the active state of a GameObject: <code>activeSelf</code> will return the value set through <code>SetActive()</code>, whereas <code>activeInHierarchy</code> will return <code>true</code> only if the GameObject and all of its parents are enabled. It&apos;s possible therefore for <code>activeSelf</code> to be <code>true</code> even though <code>activeInHierarchy</code> is <code>false</code> - when a parent object has been disabled - and in this case the GameObject is disabled in the scene.</p>
<p>This is a Good Thing: it makes the API much clearer and more intuitive. However, it&apos;s not quite backward compatible, as I found out after tidying up some code.</p>
<p>I&apos;d addressed some compiler warnings by replacing some obsolete <code>SetActiveRecursively()</code> calls with <code>SetActive()</code>. Suddenly, things went awry: runtime errors spewing through the console. My &quot;quick tidy-up&quot; had broken everything! What had happened is that <code>SetActive()</code> <em>isn&apos;t</em> a drop-in replacement for <code>SetActiveRecursively()</code>, even though the compiler warning suggests the switch.</p>
<p>When you call <code>SetActive()</code>, the change in value doesn&apos;t take place until the end of the current frame. With the old methods, the change would happen immediately. This could be a hugely important distinction because, in my case, the following line of code was a call to <a href="http://docs.unity3d.com/ScriptReference/Component.GetComponentInChildren.html"><code>GetComponentInChildren</code></a> - which explicitly only returns components from <em>active</em> GameObjects. The call was returning nothing, because the object wasn&apos;t going to be active until the end of the frame.</p>
<p>The available solutions were to introduce a frame&apos;s delay (which gives a whole slew of new problems), switch the call to <code>GetComponentsInChildren</code> (which has an overload that will include results from inactive objects, but that can get messy in a complex scene), or continue using the obsolete call. For the sake of not breaking existing code, I had to stick with the obsolete call.</p>
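<p>For illustration, the <code>GetComponentsInChildren</code> workaround looks roughly like this (the <code>Weapon</code> component is a hypothetical example, not from my project):</p>

```csharp
using UnityEngine;

public class Weapon : MonoBehaviour { }

public class ActivationExample : MonoBehaviour
{
    void Activate(GameObject target)
    {
        target.SetActive(true);

        // GetComponentInChildren<Weapon>() here can return null, because
        // the object won't report as active until the end of the frame.

        // The overload taking includeInactive: true searches disabled
        // objects too - at the risk of matching ones you didn't want.
        Weapon[] weapons = target.GetComponentsInChildren<Weapon>(true);
        Weapon weapon = weapons.Length > 0 ? weapons[0] : null;
        // ... use weapon ...
    }
}
```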
<p>This highlights why backward compatibility in an API can be massively important - and why it&apos;s doubly important to have the behaviours of a public API <em>well-documented</em>: subtle changes can have huge effects!</p>
<!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[Simple, better Unity logging]]></title><description><![CDATA[<!--kg-card-begin: markdown--><p>One of the issues that you can run into when developing with Unity is that logging is quite expensive. It&apos;s entirely possible that, if you&apos;re trying to log a verbose amount of processing for one reason or another, you can end up crippling the framerate of your</p>]]></description><link>https://blog.puzey.net/simple-better-unity-logging/</link><guid isPermaLink="false">606f3b55506b170001371181</guid><category><![CDATA[unity3d]]></category><category><![CDATA[performance]]></category><category><![CDATA[game-development]]></category><dc:creator><![CDATA[Dan Puzey]]></dc:creator><pubDate>Thu, 27 Nov 2014 12:21:42 GMT</pubDate><content:encoded><![CDATA[<!--kg-card-begin: markdown--><p>One of the issues that you can run into when developing with Unity is that logging is quite expensive. It&apos;s entirely possible that, if you&apos;re trying to log a verbose amount of processing for one reason or another, you can end up crippling the framerate of your game. There are performance investigations I&apos;ve done in Unity that have ended up with the logging being the most expensive portion of the code!</p>
<p>The problem with this is that it&apos;s often painful to remove and reinstate logging code repeatedly: logging is useful, so you don&apos;t necessarily <em>want</em> to remove the code, but you can&apos;t have a performant build with it still in place.</p>
<p>With that in mind, I&apos;ve just published <a href="https://gist.github.com/DanPuzey/669c49f8c321ba447e88">this Gist</a> on Github. It&apos;s a first-pass for something that will reduce the performance impact of logging in game builds, and make it easier to control. Here&apos;s the code - read on for the details:</p>
<script src="https://gist.github.com/DanPuzey/669c49f8c321ba447e88.js"></script>
<p>This code makes fewer logging calls when compiled as a release build, because the <code>Verbose</code> methods are removed at compile time. They simply don&apos;t exist in your release builds.</p>
<p>This class also integrates <code>string.Format</code> capability into the logging methods&apos; signature.* This brings another speed advantage: if the logging method isn&apos;t called, the string.Format isn&apos;t evaluated at all. It&apos;s surprising how much CPU time you can spend on string manipulation for strings that aren&apos;t going to be logged: this code prevents that waste.</p>
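<p>For reference, the core of the technique is C#&apos;s <code>[Conditional]</code> attribute combined with a <code>string.Format</code>-style signature - roughly like this (the class name and compilation symbol here are illustrative; see the Gist for the real code):</p>

```csharp
using System.Diagnostics;
using UnityEngine;

public static class Log
{
    // Calls to this method - including evaluation of their arguments -
    // are stripped at compile time unless VERBOSE_LOGGING is defined.
    [Conditional("VERBOSE_LOGGING")]
    public static void Verbose(string format, params object[] args)
    {
        Debug.Log(string.Format(format, args));
    }
}

// Usage: Log.Verbose("Spawned {0} agents in {1}ms", count, elapsedMs);
```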
<p>The plan going forward is to add code so that the standard <code>Message</code> call is only handled (on release builds) when a command-line parameter is present. This means that, for a typical build, you&apos;d only log warnings and errors. Imagine how much log output - and thus performance - this could be saving...</p>
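<p>A minimal sketch of that idea (the flag name is invented, and this isn&apos;t implemented yet):</p>

```csharp
using System;
using UnityEngine;

public static class MessageLog
{
    // Checked once at startup; "-verboselog" is a made-up flag name.
    private static readonly bool enabled =
        Array.IndexOf(Environment.GetCommandLineArgs(), "-verboselog") >= 0;

    public static void Message(string format, params object[] args)
    {
        if (!enabled) return; // skip the string.Format entirely
        Debug.Log(string.Format(format, args));
    }
}
```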
<p>To go with this, I&apos;d recommend a more advanced console than the one built in to Unity. I use <a href="https://www.assetstore.unity3d.com/#!/content/11889">Editor Console Pro</a>, which gives me custom highlighting, filtering and more detailed call stacks. It has the occasional glitch, but the ability to quickly search and filter logs for the things I&apos;m interested in can save so much time that those hiccups are easily forgiven.</p>
<p>I hope this proves a useful tool - please leave a comment if you find this helpful, or if you have any questions!</p>
<p>* If you&apos;re &quot;adding&quot; strings together instead of using <a href="http://msdn.microsoft.com/en-us/library/system.string.format%28v=vs.110%29.aspx"><code>string.Format</code></a> then you really need to change that!</p>
<!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[Making games run faster]]></title><description><![CDATA[<!--kg-card-begin: markdown--><p><strong>Or: why micro-optimising games code often helps.</strong></p>
<p>I was part of a conversation on <a href="http://twitter.com/DanPuzey">Twitter</a> that started with a question about the relative performance of calling <code>.GetComponent&lt;T&gt;()</code> vs having a public field assigned in the inspector. I linked back to my <a href="/benchmarking-unity3d-scripts/">earlier post</a> that includes a profiling</p>]]></description><link>https://blog.puzey.net/making-games-run-faster/</link><guid isPermaLink="false">606f3b55506b170001371180</guid><category><![CDATA[unity3d]]></category><category><![CDATA[performance]]></category><category><![CDATA[game-development]]></category><dc:creator><![CDATA[Dan Puzey]]></dc:creator><pubDate>Wed, 19 Nov 2014 22:41:57 GMT</pubDate><content:encoded><![CDATA[<!--kg-card-begin: markdown--><p><strong>Or: why micro-optimising games code often helps.</strong></p>
<p>I was part of a conversation on <a href="http://twitter.com/DanPuzey">Twitter</a> that started with a question about the relative performance of calling <code>.GetComponent&lt;T&gt;()</code> vs having a public field assigned in the inspector. I linked back to my <a href="/benchmarking-unity3d-scripts/">earlier post</a> that includes a profiling script for Unity prefabs.</p>
<p>In a followup to those tweets, I was asked &quot;should I care about optimising this code if it&apos;s 0.001% of my CPU budget?&quot; My response was &quot;generally no,&quot; but that conversation leads me to note this more subtle point about the performance of games code.</p>
<p>If you&apos;re from a typical &quot;business&quot; programming background, writing data processing and big websites and such, you learn specific things about optimisation. Don&apos;t do it prematurely: there&apos;s no point optimising what isn&apos;t slow. Don&apos;t micro-optimise: saving a millisecond on one line of code isn&apos;t worth it when there are minutes to be saved elsewhere. The first point is still absolutely true in games but, for the second point, what constitutes &quot;micro-optimisation&quot; is a very different beast.</p>
<p>The reason for this is that most code will run in the main game loop, which runs every frame. Much code will also be run multiple times within your scene. This makes trivial wins potentially <em>huge</em> in terms of performance.</p>
<p>As an example, let&apos;s say we have a particular game agent (an NPC) and that agent has a piece of code (an AI logic routine). Assume there are 20 agents active, and our game currently runs at a paltry 20fps. This piece of logic is already running <em>400 times per second</em>. That is a huge factor that makes any gain an order of magnitude better than it seems.</p>
<p>Imagine we can shave half a millisecond off the execution time of that code. It doesn&apos;t seem like much... except we&apos;ve saved that 20 times per frame, and our frame is currently taking 50ms to complete. We&apos;ve just chopped 10ms - <em>twenty percent</em> - off the frame time. That takes the FPS from 20 to 25, with what could be a one-line change.</p>
<p>The beauty of this is that the faster a game already runs, the more these tiny optimisations help, because they represent a larger percentage of the frame&apos;s CPU time. If we&apos;re running at 30fps to start with (as opposed to 20), each frame takes 33.3ms, so shaving off that same 10ms leaves a 23.3ms frame - just under 43fps.</p>
<p>The basic approach to optimising doesn&apos;t change: profile first, then attack whatever the worst offender is. The difference with games code is that, often, the worst offender is a tiny piece of code that just happens to be running hundreds of times a second.</p>
<p>That one-line micro-fix can make a <em>massive</em> difference.</p>
<!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[Panicking about Facebook's messenger app]]></title><description><![CDATA[<!--kg-card-begin: markdown--><blockquote>
<p><strong>NOTE</strong>: This post is a direct copy of a reply I made on Facebook a little while back, to a friend who was panicking about the new Facebook messaging app based on <a href="http://action.sumofus.org/a/facebook-messenger/">this</a> overreactive piece.</p>
</blockquote>
<p>Lots of alternatives to FB exist (Google+, MySpace, MSN, etc.), but none are as widely</p>]]></description><link>https://blog.puzey.net/panicking-about-facebooks-messenger-app/</link><guid isPermaLink="false">606f3b55506b17000137117f</guid><dc:creator><![CDATA[Dan Puzey]]></dc:creator><pubDate>Mon, 08 Sep 2014 15:02:47 GMT</pubDate><content:encoded><![CDATA[<!--kg-card-begin: markdown--><blockquote>
<p><strong>NOTE</strong>: This post is a direct copy of a reply I made on Facebook a little while back, to a friend who was panicking about the new Facebook messaging app based on <a href="http://action.sumofus.org/a/facebook-messenger/">this</a> overreactive piece.</p>
</blockquote>
<p>Lots of alternatives to FB exist (Google+, MySpace, MSN, etc.), but none are as widely used, and ultimately they all have pretty much the same privacy rights. The price we pay for being able to use all this stuff for free (and it is an <em>immense</em> amount of technology &amp; resource that powers a platform like Facebook) is that to a greater or lesser extent <em>we are the product</em> - they make their money from what they learn about us, and the advertising revenue that information can generate. The article/petition is an overreaction to a minimal change in the situation that has existed for years with Facebook&apos;s T&amp;Cs (Snopes&apos; response is much more reasonable!).</p>
<p>I would rest assured that the app will <em>not</em> make calls or take photos without your permission (there would be a much bigger uproar than this if it did!) - but it has to request permission to use the camera/your contact list/etc in order to function usefully. The emotions experiment that James mentioned is no different to what every commercial website in the world does: an experiment with new features that shows one thing to 50% of users and something else to the rest, to see what produces the desired result. This testing (called A/B testing, see here <a href="http://en.wikipedia.org/wiki/A/B_testing">http://en.wikipedia.org/wiki/A/B_testing</a> ) is commonplace but, when Facebook/Twitter/etc. do it, it generates panic.</p>
<p>Ultimately, anything you put on a shared service like Facebook or Twitter or otherwise is to some extent &quot;no longer yours&quot; - but most privacy is an illusion these days anyway. If you use a credit/debit card, store loyalty card, mobile phone, any website at all (even without signing in), you are already being tracked, and this isn&apos;t really any different.</p>
<p>Facebook have access to everything you post on the site, but they&apos;re not &quot;reading it&quot; in that there&apos;s no person staring at your every status or photo - but they are almost certainly running a program that scans all your statuses and reports &quot;hey, we should show this person more nappy adverts because they&apos;re always talking about babies,&quot; and automatically analyses photos for obvious obscenities, and suchlike.</p>
<p>I would say: carry on using what you&apos;re using, be aware of the privacy settings (i.e. make sure you know who you&apos;re sharing with), and ask a friendly geek if you have any deeper concerns. Most stories like this are panic-mongering - and often they&apos;re doing it because it&apos;s the most likely thing to get you to click their link, and earn <em>them</em> some advertising revenue. Cynical, I know, but true.</p>
<!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[Unity3d brown bag]]></title><description><![CDATA[<!--kg-card-begin: markdown--><p>Last week at work I ran a short <a href="http://en.wikipedia.org/wiki/Brown_bag_seminar">brown bag session</a> on <a href="http://unity3d.com">Unity3d</a>. Whilst my office job (with a HR/payroll systems provider) doesn&apos;t have <em>much</em> use for 3d games technology, my team were all very enthusiastic to get an overview of games development.</p>
<p>I covered</p>]]></description><link>https://blog.puzey.net/unity3d-brown-bag/</link><guid isPermaLink="false">606f3b55506b17000137117e</guid><category><![CDATA[unity3d]]></category><category><![CDATA[game-development]]></category><dc:creator><![CDATA[Dan Puzey]]></dc:creator><pubDate>Mon, 04 Aug 2014 19:53:02 GMT</pubDate><content:encoded><![CDATA[<!--kg-card-begin: markdown--><p>Last week at work I ran a short <a href="http://en.wikipedia.org/wiki/Brown_bag_seminar">brown bag session</a> on <a href="http://unity3d.com">Unity3d</a>. Whilst my office job (with a HR/payroll systems provider) doesn&apos;t have <em>much</em> use for 3d games technology, my team were all very enthusiastic to get an overview of games development.</p>
<p>I covered at breakneck pace some of Unity&apos;s basic functionality, and how/where you can add your own code to get things running. Starting from scratch and using the stock asset bundles, I put together a quick (and boxy!) FPS scene with a working gun and destroyable barrels.</p>
<p>I did a practice run for this in a Bitbucket repo that is publicly available <a href="https://bitbucket.org/dpuzey/unity3d-brownbag">here</a>. The code there is almost exactly what we ended up with after less than an hour in the session. Right now it&apos;s provided without documentation, but if people are interested then I could sanitise my brief notes into something vaguely useful...</p>
<!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[Upcoming gigs]]></title><description><![CDATA[<!--kg-card-begin: markdown--><p>I&apos;ve been pretty quiet on the musical front of late, but I have three gigs with three different bands coming up and - in an unexpected twist - these are  open to public consumption! So, feast your eyes on this list and then, if something takes your fancy,</p>]]></description><link>https://blog.puzey.net/gigs-20140714/</link><guid isPermaLink="false">606f3b55506b17000137117d</guid><category><![CDATA[gigs]]></category><category><![CDATA[mango-factory]]></category><category><![CDATA[bristol]]></category><dc:creator><![CDATA[Dan Puzey]]></dc:creator><pubDate>Mon, 21 Jul 2014 20:42:52 GMT</pubDate><content:encoded><![CDATA[<!--kg-card-begin: markdown--><p>I&apos;ve been pretty quiet on the musical front of late, but I have three gigs with three different bands coming up and - in an unexpected twist - these are  open to public consumption! So, feast your eyes on this list and then, if something takes your fancy, come feast your ears too!</p>
<ul>
<li><strong><a href="http://doctorchocolate.co.uk">Doctor Chocolate</a> @ <a href="http://thethunderbolt.net/">The Thunderbolt</a></strong>, Bristol, Friday July 25th. 9-piece party band playing a mix of tunes from the &apos;60s to the &apos;10s. I used to play regularly, but this is a rare dep appearance for me (and an even rarer public gig from the Chocs!). &#xA3;5 on the door.</li>
<li><strong><a href="http://www.facebook.com/pages/Mango-Factory/37968464422">Mango Factory</a> @ <a href="http://www.theoldduke.co.uk/">The Old Duke</a></strong>, Bristol, Saturday July 26th. Currently my main gig, this is a fantastic mix of funky tunes with a bit of latin influence thrown in. Addictive horn lines, soaring vocals, and lots of grooves - all original tunes that you can&apos;t help but dance to. Come on down and hear me playing my favourite music! Free entry.</li>
<li><strong><a href="http://monkeychuckle.co.uk/">Monkey Chuckle</a> @ <a href="http://www.thebein.co.uk">Be.In</a></strong>, Bristol, Saturday August 9th. Described by Craig Charles as having &quot;superb tight arrangements and a real sense of urgency, all you need a great funk band to be,&quot; I&apos;ll be depping a couple of gigs with Monkey Chuckle in the near future. Come catch the first one and find out whether I can keep up with them!</li>
</ul>
<p>The Mangos are playing again on August 30th at The Old Firehouse in Exeter (home of excellent pizza!), and on September 19th at the Grain Barge in Bristol. I also have a second dep gig with Monkey Chuckle on August 23rd at the Landmark Theatre in Ilfracombe.</p>
<p>So, plenty to choose from - please come along!</p>
<!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[Comment away!]]></title><description><![CDATA[<!--kg-card-begin: markdown--><p>I&apos;ve just enabled <a href="https://disqus.com">Disqus</a> comments on the site: now this at least has a <em>chance</em> of being a two-way conversation!</p>
<p>It was a trivially easy thing to get set up, made even more so by the provision of <a href="https://help.disqus.com/customer/portal/articles/1454924-ghost-installation-instructions">specific instructions</a> for getting Disqus running on Ghost. I&apos;</p>]]></description><link>https://blog.puzey.net/comment-away/</link><guid isPermaLink="false">606f3b55506b17000137117c</guid><dc:creator><![CDATA[Dan Puzey]]></dc:creator><pubDate>Fri, 04 Jul 2014 09:58:04 GMT</pubDate><content:encoded><![CDATA[<!--kg-card-begin: markdown--><p>I&apos;ve just enabled <a href="https://disqus.com">Disqus</a> comments on the site: now this at least has a <em>chance</em> of being a two-way conversation!</p>
<p>It was a trivially easy thing to get set up, made even more so by the provision of <a href="https://help.disqus.com/customer/portal/articles/1454924-ghost-installation-instructions">specific instructions</a> for getting Disqus running on Ghost. I&apos;m not sure I can add much here other than to say that I&apos;m very much liking all the software I&apos;m running here!</p>
<p>Over the next few weeks I&apos;ll slowly pull the site into some semblance of designed order... keep watching!</p>
<!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[Git credential failures]]></title><description><![CDATA[<!--kg-card-begin: markdown--><p>Over the last couple of days at work, my Git started to play up: every time I tried to interact with the remote repo (for a <code>fetch</code>, <code>push</code> or <code>pull</code>), I got error messages like this:</p>
<pre><code>Failed to erase credential: Element not found

fatal: Authentication failed for https://rhubarb.custard</code></pre>]]></description><link>https://blog.puzey.net/git-credential-failures/</link><guid isPermaLink="false">606f3b55506b170001371178</guid><category><![CDATA[git]]></category><category><![CDATA[fix]]></category><dc:creator><![CDATA[Dan Puzey]]></dc:creator><pubDate>Thu, 03 Jul 2014 14:59:00 GMT</pubDate><content:encoded><![CDATA[<!--kg-card-begin: markdown--><p>Over the last couple of days at work, my Git started to play up: every time I tried to interact with the remote repo (for a <code>fetch</code>, <code>push</code> or <code>pull</code>), I got error messages like this:</p>
<pre><code>Failed to erase credential: Element not found

fatal: Authentication failed for https://rhubarb.custard
</code></pre>
<p>This continued to happen after a machine restart, and affected everything from git-bash to <a href="http://dahlbyk.github.io/posh-git/">Posh-git</a> to <a href="http://www.sourcetreeapp.com">SourceTree</a>.</p>
<p>It turned out that I had <a href="https://code.google.com/p/gitextensions/">Git Extensions</a> installed, which includes a credential helper. I hadn&apos;t realised, but that helper was remembering my login for the Git repo, so I didn&apos;t need to enter it each time. Somehow, the helper had apparently got snarled up and wasn&apos;t working correctly.</p>
<p>It&apos;s enabled through the following lines in the <code>.gitconfig</code> file:</p>
<pre><code>[credential]
  helper = !\&quot;C:/Program Files (x86)/GitExtensions/GitCredentialWinStore/git-credential-winstore.exe\&quot;
</code></pre>
<p>I removed those lines from my <code>.gitconfig</code> and ran a fetch: I had to enter my credentials, but it worked! I then readded the same lines, and fetched again, and my Git was back working just as before.</p>
<p>Hopefully this might help someone else with the same problem - despite lots of search hits, there was no advice that referred to problems with an existing repo.</p>
<!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[Benchmarking Unity3d scripts]]></title><description><![CDATA[<!--kg-card-begin: markdown--><p>As part of my ongoing involvement with <a href="http://www.big-robot.com">Big Robot</a> I&apos;ve been doing a bit of work recently on working out &quot;how best to do things&quot; in Unity3d. Some of the output of that might turn up in a later post, but, since this is my first</p>]]></description><link>https://blog.puzey.net/benchmarking-unity3d-scripts/</link><guid isPermaLink="false">606f3b55506b170001371177</guid><category><![CDATA[unity3d]]></category><category><![CDATA[performance]]></category><category><![CDATA[game-development]]></category><dc:creator><![CDATA[Dan Puzey]]></dc:creator><pubDate>Mon, 09 Jun 2014 22:03:24 GMT</pubDate><content:encoded><![CDATA[<!--kg-card-begin: markdown--><p>As part of my ongoing involvement with <a href="http://www.big-robot.com">Big Robot</a> I&apos;ve been doing a bit of work recently on working out &quot;how best to do things&quot; in Unity3d. Some of the output of that might turn up in a later post, but, since this is my first post in a looong time, I thought I&apos;d break myself in gently.</p>
<p>Part of what I&apos;ve been looking at is what I&apos;d call &quot;general performance issues;&quot; that is, &quot;how should I write a typical bit of code in the most performant way?&quot; Rather than testing by gut, I wanted some reusable code that would help me get genuine results, and so I created a class that would let me profile code running in a scene at a prefab level.</p>
<p>The way it&apos;s intended to be used is this:</p>
<ol>
<li>create a scene, and add a single gameObject</li>
<li>add the benchmarking component to the gameObject</li>
<li>drag a prefab from your project into &quot;Prefab Under Test&quot; in the inspector</li>
<li>run the scene (or run a build, if you want more real-world results)</li>
</ol>
<p>What you&apos;ll see is a long pause, and then console output with the test results; something like this:</p>
<pre><code>PERFTEST: initialise 0.04875484s; run 1.685191s (59.34047 FPS).
UnityEngine.Debug:Log(Object)
</code></pre>
<p>The test creates a large number of instances of your prefab, and measures two things: the &quot;first-frame&quot; time (which includes the time taken to call <code>Awake</code>, <code>Start</code> and the first <code>Update</code> for every instance), and the time taken for a configurable number of frames to render (which measures just your <code>Update</code> calls).</p>
<p>There&apos;s a short (configurable) delay on starting, which ensures that no other startup code is getting in the way of the benchmark, and the instance count and frame count are configurable so that you don&apos;t choke your IDE on too many instances.</p>
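<p>As a rough sketch of the measurement approach (a simplification for illustration, not the real class):</p>

```csharp
using System.Collections;
using UnityEngine;

public class MiniBenchmark : MonoBehaviour
{
    public GameObject prefabUnderTest; // dragged in via the inspector
    public int instanceCount = 1000;
    public int frameCount = 100;

    IEnumerator Start()
    {
        float t0 = Time.realtimeSinceStartup;
        for (int i = 0; i < instanceCount; i++)
        {
            Instantiate(prefabUnderTest);
        }
        yield return null; // one frame: Awake, Start and first Update run
        float initialise = Time.realtimeSinceStartup - t0;

        float t1 = Time.realtimeSinceStartup;
        for (int i = 0; i < frameCount; i++)
        {
            yield return null; // steady state: just the Update calls
        }
        float run = Time.realtimeSinceStartup - t1;

        Debug.Log(string.Format("PERFTEST: initialise {0}s; run {1}s ({2} FPS).",
            initialise, run, frameCount / run));
    }
}
```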
<p>I&apos;ve put this on a <a href="https://gist.github.com/DanPuzey/126b284a2b2dde73ae21">public Gist</a> in case it&apos;s useful. Most likely I&apos;m going to add support for testing multiple prefabs in succession, for comparative testing, but I&apos;m not sure how much of a factor execution order might be in such a test - so for now it&apos;s left for each prefab to be profiled as a separate run.</p>
<p>The code in full:</p>
<script src="https://gist.github.com/DanPuzey/126b284a2b2dde73ae21.js"></script><!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[VS Ultimate – leaving a mess]]></title><description><![CDATA[<!--kg-card-begin: markdown--><p>I was poking around my rapidly-filling SSD, wondering exactly what was taking up all the space on my C: drive, when I noticed that the <code>c:\programdata</code> folder was over 2GB in size. That&#x2019;s not necessarily unusual, except that I&#x2019;m careful to redirect most applications that</p>]]></description><link>https://blog.puzey.net/vs-ultimate-leaving-a-mess/</link><guid isPermaLink="false">606f3b55506b170001371176</guid><category><![CDATA[intellitrace]]></category><category><![CDATA[vs2010]]></category><dc:creator><![CDATA[Dan Puzey]]></dc:creator><pubDate>Tue, 20 Mar 2012 12:00:00 GMT</pubDate><content:encoded><![CDATA[<!--kg-card-begin: markdown--><p>I was poking around my rapidly-filling SSD, wondering exactly what was taking up all the space on my C: drive, when I noticed that the <code>c:\programdata</code> folder was over 2GB in size. That&#x2019;s not necessarily unusual, except that I&#x2019;m careful to redirect most applications that would use it to a different drive.</p>
<p>When I looked in more detail, I found that almost all of that 2GB was in one place: <code>C:\ProgramData\Microsoft Visual Studio\10.0\TraceDebugging</code></p>
<p>That&#x2019;s where Visual Studio stores its <a href="http://msdn.microsoft.com/en-us/library/dd264915.aspx">Intellitrace</a> dump files &#x2013; and it seems it&#x2019;s not cleaning up after itself! Some of the files were over a year old, and I&#x2019;m pretty confident there&#x2019;s no useful debugging to be done with them now ;-)</p>
<p>The VS options allow a maximum size <em>per recording</em>, but it seems they&#x2019;ll let the overall folder grow without limit. Worth knowing &#x2013; and worth keeping an eye on if you&#x2019;re short of disc!</p>
<!--kg-card-end: markdown-->]]></content:encoded></item></channel></rss>