Week #485-486-487

Friday, June 12th, 02020 at 15:51 UTC

This is our triple weeknote. Our previous weeknote covered four weeks, so at least we’re improving our frequency.

The good news is that these weeknotes have gone unwritten because we’ve been swamped with projects, not because there’s nothing to say.

Week #485

This was the last week in May, so it was also the week we got the month’s finances in order. Luckily, we didn’t have any outstanding invoices. After COVID-19, we’re down to only 3 core customers. That makes invoicing and chasing up customers much easier. That’s not to say we don’t have room for more projects, or that we don’t have lots of small one-off projects that also get invoiced. Activities like writing articles and running online workshops also generate some sporadic revenue and invoices.

As part of the stealth project, we needed to find the overlap between the retail stores of two companies. We wanted to know how many were in the same shopping mall. We built a quick scraper with BeautifulSoup to grab the listing page, then looped through all the links to get the addresses of the individual stores, followed by a quick geo-lookup to get a latitude and longitude for each. That gave us a spreadsheet of all the locations. Google My Maps allows for a CSV upload, so now we have a simple visualisation to help find candidate locations.
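The scrape-then-export part of that pipeline looks roughly like this. A minimal sketch, run here against a canned HTML snippet rather than the live listing page — the URLs, CSS classes and markup are hypothetical stand-ins, and the real site’s structure will differ:

```python
# Sketch of the store-listing scrape: parse the listing, collect each
# store's name, link and address, then emit CSV ready for Google My
# Maps. The markup below is invented for illustration.
import csv
import io

from bs4 import BeautifulSoup

LISTING_HTML = """
<ul class="store-list">
  <li><a href="/stores/1">Store One</a><span class="addr">1 High St, London</span></li>
  <li><a href="/stores/2">Store Two</a><span class="addr">2 Main Rd, Leeds</span></li>
</ul>
"""

def extract_stores(html):
    """Pull (name, link, address) tuples out of a listing page."""
    soup = BeautifulSoup(html, "html.parser")
    stores = []
    for item in soup.select(".store-list li"):
        link = item.find("a")
        addr = item.find("span", class_="addr")
        stores.append((link.text, link["href"], addr.text))
    return stores

def to_csv(rows):
    """Write the rows out as CSV text for the My Maps upload."""
    out = io.StringIO()
    writer = csv.writer(out)
    writer.writerow(["name", "url", "address"])
    writer.writerows(rows)
    return out.getvalue()

stores = extract_stores(LISTING_HTML)
print(to_csv(stores))
```

In the real version, each store link gets fetched in the loop and the address goes through a geocoding service to add the latitude and longitude columns.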

Week #486

This week was mostly pair-programming work. We’ve been working on an economy simulator. The idea is to build a state machine with weighted probabilities to simulate the gameplay. We can run that 1,000 or more times and watch how much in-game currency the players accumulate. With this, we can help determine pricing and how much experience is issued, and generally keep the gameplay both fun and rewarding at an ideal pace.
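In miniature, that approach looks something like the following. The states, transition weights and currency payouts here are invented for illustration — the real simulator’s state machine is far larger:

```python
# A toy weighted state machine for a game economy: each state lists
# its possible next states with probabilities, and some states pay
# out (or cost) in-game currency. Run many times, the spread of final
# balances hints at how the economy behaves.
import random

# state -> (possible next states, their probability weights)
STATES = {
    "idle": (["play", "shop", "idle"], [0.6, 0.1, 0.3]),
    "play": (["win", "lose"], [0.4, 0.6]),
    "win":  (["idle"], [1.0]),
    "lose": (["idle"], [1.0]),
    "shop": (["idle"], [1.0]),
}

# currency change on entering each state (made-up numbers)
CURRENCY = {"idle": 0, "play": 0, "win": +10, "lose": 0, "shop": -5}

def simulate(steps=200, seed=None):
    """Run one player through the state machine; return final balance."""
    rng = random.Random(seed)
    state, balance = "idle", 0
    for _ in range(steps):
        nxt, weights = STATES[state]
        state = rng.choices(nxt, weights=weights)[0]
        balance += CURRENCY[state]
    return balance

# Run 1,000 simulated players and look at the spread of balances.
balances = [simulate(seed=i) for i in range(1000)]
print(min(balances), sum(balances) / len(balances), max(balances))
```

Tweaking the weights and payouts and re-running is how a setup like this helps tune pricing and pacing.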

The first pass of the simulator set up all the possible states, inputs (sources) and outputs (sinks). There were two main problems. Firstly, we had to guess at the probabilities. Secondly, this was modelling an ‘average’ player, and in all our experience, the stats for an ‘average’ player don’t actually represent any real player. But you have to start somewhere.

The next step was to create a few (we chose four) profiles. Each of these behaved differently (they had different probabilities of doing each action) and made up a different percentage of the overall community.

Running the simulation over different mixes of the profiles gave a better overview of the in-game economy.
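The profile-mix idea can be sketched like so. The four profile names, their community shares and per-session gains are all invented numbers, and `simulate_profile` is a hypothetical stand-in for running the real per-profile state machine:

```python
# Sketch of weighting the economy simulation by player-profile mix:
# draw players according to each profile's share of the community,
# simulate each one, and total the currency per profile.
import random

# profile -> (share of community, mean currency gain per session)
# -- illustrative numbers only
PROFILES = {
    "casual":      (0.50, 5),
    "regular":     (0.30, 12),
    "competitive": (0.15, 25),
    "whale":       (0.05, 60),
}

def simulate_profile(mean, sessions, rng):
    """Stand-in for the per-profile simulator: noisy gain per session."""
    return sum(rng.gauss(mean, mean * 0.2) for _ in range(sessions))

def economy_overview(players=1000, sessions=30, seed=0):
    """Draw players by profile share and total their currency."""
    rng = random.Random(seed)
    names = list(PROFILES)
    shares = [PROFILES[n][0] for n in names]
    totals = {n: 0.0 for n in names}
    for _ in range(players):
        name = rng.choices(names, weights=shares)[0]
        totals[name] += simulate_profile(PROFILES[name][1], sessions, rng)
    return totals

print(economy_overview())
```

Changing the shares (say, more competitive players and fewer casuals) and re-running shows how sensitive the overall economy is to the community mix.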

The last step was to gather some analytics data from real players and feed those updated probabilities into the simulator. In doing so, we found a few discrepancies in the simulator, but also issues in the analytics data itself.

Week #487

This week we continued with the simulator, but from the profiles end. Using some machine learning, specifically k-means clustering, we programmatically clustered the users into various profiles. To validate the model, we took a subset of the players and, using only their activity on their first day of play, asked the model to predict which profile they matched. We then ran the same players’ entire play history through the model and compared the first-day prediction with the lifetime profile. From there we could go back to the original model, look at different features, and iterate until the predictions improved.
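For a sense of the clustering step, here is a bare-bones k-means on made-up per-player feature vectors (e.g. sessions per week, minutes played). This is a plain stdlib sketch, not the tooling we actually used, and the feature values are invented:

```python
# Minimal k-means: repeatedly assign each point to its nearest
# centroid, then move each centroid to the mean of its members.
import math
import random

def kmeans(points, k, iters=50, seed=0):
    """Cluster points into k groups; return (centroids, labels)."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    labels = [0] * len(points)
    for _ in range(iters):
        # Assignment step: nearest centroid per point.
        labels = [min(range(k), key=lambda c: math.dist(p, centroids[c]))
                  for p in points]
        # Update step: each centroid moves to its members' mean.
        for c in range(k):
            members = [p for p, l in zip(points, labels) if l == c]
            if members:
                centroids[c] = tuple(sum(dim) / len(members)
                                     for dim in zip(*members))
    return centroids, labels

# Two obvious groups: low-activity and high-activity players
# (sessions per week, minutes played) -- invented data.
players = [(1, 10), (2, 12), (1, 8), (9, 90), (10, 95), (8, 88)]
centroids, labels = kmeans(players, k=2)
print(labels)
```

Each resulting cluster becomes a candidate profile; predicting a new player’s profile is then just finding their nearest centroid.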

After some coding, we managed to tag every user with a profile after their first day. Having this information allows administrators to better target different player profiles with different offers, options and experiences.

This week also included a bunch of work on our stealth project, mostly bug clean-up. As more people get in, we are finding all the edge cases we haven’t addressed yet.