Saturday, December 5, 2020

December Update

After a long bout of sickness, I have returned to the world of the living. 

-Did some remodeling of the blog: the UVG Digital DM Screen now has a home here! I've created tabs for it and the Mothership Character Generator so they're easier to find. 

-Progress on the Discord Games continues. I've been hesitant to blog about my more experimental projects, but I think it could be interesting to record my various adventures and foibles, focusing on the process over the results. Is this something you would be interested in seeing more of? 

-Just got my copy of Gradient Descent, and it completely blew me away. I'm considering doing a review.

-I trained my first little neural network for class this week! It can categorize blog posts by subject with pretty decent accuracy! Going to be doing a second round of testing this weekend to see if I can push that accuracy up. (There's a rough Python sketch of this kind of pipeline at the bottom of this post.)

-I've been trying to sum up my most recent project, Interdimensional Voyages, in a blog post, and it has been very difficult. I've always been more inclined to forge ahead into the future than to take stock of past projects. This challenge will go on the to-do list.

-One of my pre-New Year's resolutions is to blog more and be more active on Discord. So we'll see how that goes! Expect more posts soon. Wishing you all the best,

-Saker

(P.S. Check out Max Cantor's awesome Kickstarter for Maximum Recursion Depth! It's got 8 days to go as of writing this post!)
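
(P.P.S. For the technically curious: the class model itself was built in a visual tool rather than from scratch, so what follows is only a rough Python sketch of the same kind of pipeline, with scikit-learn as a stand-in. The post text, subject labels, and layer size are all made up for illustration.)

    # Rough sketch: TF-IDF text features feeding a small feed-forward
    # neural network that tags each blog post with a subject.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.neural_network import MLPClassifier
    from sklearn.pipeline import make_pipeline

    posts = ["new tabs for the UVG digital DM screen",    # placeholder post text
             "character generator tables for Mothership",
             "training a tiny neural network for class"]
    labels = ["blog-news", "mothership", "machine-learning"]  # placeholder subjects

    # Pipeline: raw text -> TF-IDF vectors -> one-hidden-layer network.
    clf = make_pipeline(TfidfVectorizer(),
                        MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000))
    clf.fit(posts, labels)

    print(clf.predict(["a post about generating sci-fi characters"]))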

 

4 comments:

  1. Thanks for the plug :)! The game is fully backed with all stretch goals, which is mind-boggling, but I'd certainly love to see even more support in this final week; it will give me more flexibility and potentially let me add even more to the game, even though all the stretch goals per se have been met.

    I'm glad to see this update. Obviously I've got my hands full at the moment, on top of work and life stuff, but if there's anything small I can do to help with blog posts or with organizing your thoughts around games and settings, please let me know.

    I backed the digital version of Gradient Descent but still haven't read it. Now that you're a neural net guy, it may not be too surprising, but the name of that setting was very loosely an inspiration for the name Maximum Recursion Depth ;).

    Do you have the code on GitHub? I'd be interested to see what you've done, or for you to report on it on the blog. I'm more on the engineering side nowadays, but engineering for ML, so I'm always trying to keep one toe in that pool. I had intended to train a model like what you're describing as well, but never got around to it >.<.

    Replies
    1. I will definitely take you up on that offer!
      As for the neural net, I was using it as a data model through RapidMiner for class, so I learned how to train and test it without having to deal with the code. Of course, now that I've gotten hooked, I've put myself through a self-directed class on NNs and am going to try making my own (in either TensorFlow or PyTorch) when I have the time!
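
      To make that concrete, here's a minimal sketch of the kind of starting point I have in mind, assuming the TensorFlow/Keras route (the post text, subject ids, and layer sizes are all made up for illustration):

      import tensorflow as tf

      # Placeholder post text and integer subject ids, purely illustrative.
      texts = tf.constant(["uvg dm screen update",
                           "mothership character tables",
                           "neural net experiments"])
      labels = tf.constant([0, 1, 2])

      # TextVectorization maps raw strings to integer token sequences.
      vectorize = tf.keras.layers.TextVectorization(output_sequence_length=16)
      vectorize.adapt(texts)

      model = tf.keras.Sequential([
          vectorize,
          tf.keras.layers.Embedding(
              input_dim=len(vectorize.get_vocabulary()), output_dim=8),
          tf.keras.layers.GlobalAveragePooling1D(),
          tf.keras.layers.Dense(3, activation="softmax"),  # one output per subject
      ])
      model.compile(optimizer="adam",
                    loss="sparse_categorical_crossentropy",
                    metrics=["accuracy"])
      model.fit(texts, labels, epochs=10, verbose=0)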

    2. I am not familiar with RapidMiner; sounds interesting. I don't have a ton of firsthand experience with TensorFlow or PyTorch, but the data scientists on my team use both (mostly TensorFlow, I think), so either is a good choice. My sense is that TensorFlow is a better representation of industry ML, whereas PyTorch is a bit more academic in nature, though those lines are probably blurrier than I'm making them sound. Even though fewer people use it, you may also want to consider Spark ML, if for no other reason than as an excuse to get comfortable with Spark, and with cluster computing more generally. It's easier to learn ML when you already know how to code with big data ;).

    3. Looks like Spark ML and TensorFlow will be my first escapades! Thank you for the guidance! C:
