Monthly Archives: February 2016

Steam’s Automated Refund System Is Far From Perfect

Last June, Valve released its automated refund system, which uses a strict set of criteria to determine whether you're eligible for a refund. The basic criteria are as follows:

  • You must request a refund within 14 days of purchase (or within 14 days of launch if it's a pre-purchase).
  • You cannot refund a title you've played for more than 2 hours.
  • DLC and in-game purchases (also called microtransactions) are refunded only at the developer's discretion.

Steam’s full refund policy can be viewed here

This marked an improvement: previously, Valve only provided refunds when a game was so broken that the entire community complained. In theory, the new criteria seem comprehensive and completely fair. However, there are many circumstances the refund system does not consider, such as:

  • Some games use their own 'patcher' rather than updating fully through Steam's downloader, and can take hours to update depending on the size of the patch and the quality of the user's connection. Time spent patching still counts as play time, because Steam's counter doesn't distinguish between leaving a patcher open and actually playing. Valve could potentially fix this by checking whether the game window is in focus before counting time, and by letting developers specify which executables belong to the patcher rather than the game itself.
  • Some games fail to terminate their processes when closed, and keep accumulating play time. I've noticed that some leave background processes running, or minimize to the system tray on exit. The Mighty Quest for Epic Loot is notorious for this: I personally have over 1,500 hours logged because I forget that it hides in the notification area. My friends think I'm addicted to it, but in reality I only play it a few minutes a day.
  • Online games, especially MMORPGs, often suffer server issues or login queues on release day, and time spent waiting in queue is logged as play time. Queues as long as 10 hours are not unusual in popular MMORPGs at launch; I've personally experienced them in WildStar, ArcheAge, and Blade & Soul. Launch queues are understandable, but Valve's system needs to account for them.
  • Games often come bundled with in-app purchases, making the whole package non-refundable. Rockstar Games drew a lot of criticism for this: it created a bundle of Grand Theft Auto V that included in-game items, then ran a "sale" on the bundle priced exactly the same as the game normally costs without the items. Because it was sold as a bundle with in-game items, the game itself could not be refunded.
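The focus-aware counting suggested above could be sketched like this. To be clear, this is a hypothetical illustration, not Steam's actual implementation: the `PlaytimeCounter` type and `tick` function are made up, and in a real client the focus and patcher flags would come from the window manager and a developer-supplied executable list.

```c
#include <stdbool.h>

/* Hypothetical sketch: only accumulate refund-relevant play time while
 * the game window is focused and the running process is the game
 * itself, not a developer-flagged patcher executable. */
typedef struct {
    long played_seconds;   /* counted toward the 2-hour refund limit */
    long idle_seconds;     /* patching, queueing, unfocused time */
} PlaytimeCounter;

void tick(PlaytimeCounter *c, bool is_focused, bool is_patcher) {
    if (is_focused && !is_patcher)
        c->played_seconds += 1;   /* one tick = one second */
    else
        c->idle_seconds += 1;
}
```

Under this scheme, an hour spent with a patcher window open would add nothing to the refund clock.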

No Humans, Only Robots

Steam’s refund policy states “even if you fall outside of the refund rules we’ve described, you can ask for a refund anyway and we’ll take a look.”

This sounds promising, but it is far from the truth. The entire system of contacting them for an exception is a placebo: it tries to make you believe that someone actually considered the information you provided, which is simply not true.

I personally requested a refund for a title I had played for only 30 minutes, which Steam counted as 3-4 hours due to some of the reasons stated above. I made it very clear in my request that I had only played for 30 minutes, even linking a stream VOD as proof that the vast majority of the time was spent waiting in queue. Despite this evidence, my request was denied every time.


Some of my attempts at a refund


Every time I requested a refund, I received this same automated response.

A Bizarre Pattern

When requesting refunds, I noticed a particular pattern. I would first get a notification saying the refund request was received; then, 1-2 hours later, after a seemingly pseudo-random delay, I would receive a scripted response saying I could not get a refund because I had played over 2 hours, even though my request clearly demonstrated that I had not. Thinking I had reached a support agent with poor reading comprehension, or one who didn't fully understand the policy, I tried at different times of day to get a different person, hoping the ticket would be handled properly.

I also noticed the time of day didn't matter. I could request a refund at 3 a.m. Seattle time, where Valve is located, and it would still take roughly the same amount of time as a request made during normal weekday work hours. Some may think this just means Valve goes above and beyond to provide reliable, 24/7 customer support. That would seem believable, were it not for the fact that Valve's support in departments that aren't fully automated (such as billing and account support) often takes weeks or months to respond.

Overall, Steam's refund system sets false expectations for its users about having their case reviewed. The system deliberately delays its response to mimic a human reviewing the case, then sends its automated reply 1-2 hours later. If you're within the specified criteria, you're guaranteed a refund; if you're outside them, don't expect an actual employee to read your ticket.

If Valve admitted that refund requests aren't reviewed manually, this wouldn't be a problem. It's incredibly frustrating that they offer a text box to explain exactly what happened, only for that input to be thrown out or ignored. The text box only creates false expectations, and the system would be better off without it. It's understandable that Valve wants to automate everything to reduce costs and increase profits, but they should not falsely claim that refund requests are manually reviewed on a "case by case basis".

If Valve wishes to comment on the issue, please contact us at

US National Highway Traffic Safety Administration Recognizes Google’s AI as a Qualified Driver


The US National Highway Traffic Safety Administration has decided to officially recognize Google's autonomous driving system as a valid driver in the US. This marks the government's first move toward redefining a "driver" as something other than a human.

For Google's self-driving cars, the machine and its software are classified as the driver, not the human owner. This potentially shifts liability away from the owner of the car and onto Google, or whichever company develops the autonomous driving software. For this reason, Google will need to make its artificial intelligence near-perfect before its cars or software reach the market, as the potential liability and cost from errors resulting in injury or death would be enormous.

Allowing AI software and self-driving systems to qualify as drivers marks an important milestone for the development of self-driving cars. By fitting within the definitions of existing laws, it can increase the freedom of self-driving car owners. For example, laws against texting or drinking while driving would ideally not apply to occupants of a fully automated car, as they have no control over the vehicle and are not considered the driver.

An estimated 94% of car crashes are the result of human error, including poor decision making, slow reaction time, and impairments such as lack of sleep or alcohol. If Google's artificial intelligence reaches near-perfection, it could reduce the number of accidents substantially. In addition, 2% of crashes are attributed to vehicle error, and a significant portion of those "vehicle errors" are really human errors, such as skipping proper maintenance on tires, brakes, and so on. Ideally, a smart self-driving car could detect deterioration of parts through the way the car responds to input, notify the user to arrange a repair, and eventually lock the user out for their own safety if the vehicle becomes too dangerous to drive.

The 2% of crashes resulting from environmental conditions can also be partially attributed to human error. Slick roads, glare, and other abnormal conditions can be recognized by the driving software, which can then drive more cautiously to match the conditions.

Google is not the only company pursuing autonomous driving. Countless automobile manufacturers have pumped R&D funds into this technology, and NVIDIA recently announced its Drive PX 2, a device that uses deep learning to recognize correct driving patterns, identify objects visually, and map out the driving scene in a virtual 3D space.

League of Legends’ “Hextech Crafting” Is Being Tested on Turkey’s Servers.

Hextech Crafting, an upcoming League of Legends system that gives players new ways to unlock champions and skins by playing the game, has been released in Turkey.

This tactic is used quite often by game developers: by releasing a product to a smaller market first, you upset fewer consumers in the event of a catastrophic failure, and can either pull the product or fix its issues. While the system has been tested on the Public Beta Environment (PBE), bugs and flaws quite often go unnoticed until they reach live servers. For example, Riot likely can't gather feedback on drop rates and progression from the PBE, because people generally don't play seriously there, and because keys and chests were made free there for testing purposes. On the Turkish servers, Riot will receive real-world feedback on players' experience with the system.
Riot's "Socrates" has stated that they intend to wait "at least 4 weeks" before enabling it in other regions. If everything goes well, we may see it hit regions such as North America, Australia, and Europe in March.

iOS Date & Time Bug Bricks 64 bit Apple Devices

Users have recently uncovered a rare bug that can brick an iPod, iPhone, or iPad. The bug is known to affect iOS devices with A7, A8, A8X, A9, and A9X chipsets, on all versions of the OS. Trolls around the web are attempting to trick unsuspecting individuals into triggering it, using fake infographics and other tools to their advantage. If anyone tells you to complete the steps listed below, don't listen to them!

Steps to Replicate the bug

WARNING: Do not attempt to replicate this bug unless you wish to render your device unusable!

Step 1: Go to Settings.

Step 2: Select General, then Date & Time.

Step 3: Change the date to January 1, 1970.

Step 4: Scroll up to the year 2000, then go back.

Step 5: Go back into the time settings and set the year to 1970 again.

Step 6: Reboot the device; it will get stuck on the Apple logo.

Note: 32 bit devices are unaffected by this bug.
Users have reported a variety of fixes, such as disconnecting and reconnecting the battery or SIM card, or simply waiting several hours. Regardless, we don't recommend triggering this bug unless you are willing to lose your device.

The bug seems to be caused by some interaction with Unix time. On Unix-based systems, time is measured as the number of seconds since midnight on January 1, 1970, the same date associated with the bug. 32-bit and 64-bit devices handle these numbers differently. The bug may cause the value written to the time variable to become invalid, or to reference an invalid part of memory, leaving the device unable to boot. Given that the process also involves setting the year to 2000, it may additionally involve code written to prevent issues with the infamous Y2K bug.
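One plausible mechanism, sketched below, is speculation based on the symptoms rather than Apple's confirmed root cause. Midnight on January 1, 1970 local time in a timezone west of UTC falls *before* the Unix epoch, so the stored timestamp is negative; if any component then reinterprets those 64 bits as an unsigned count, the value wraps around to an astronomically distant "future" date.

```c
#include <stdint.h>

/* Midnight, 1 Jan 1970 local time in a UTC-8 timezone is 8 hours
 * before the Unix epoch, i.e. a negative timestamp. */
int64_t local_epoch_offset(int hours_behind_utc) {
    return (int64_t)(-hours_behind_utc) * 3600;
}

/* If another component reinterprets the same bits as an unsigned
 * 64-bit count, the negative value wraps to an enormous number,
 * which boot-time code may be unable to make sense of. */
uint64_t as_unsigned(int64_t t) {
    return (uint64_t)t;
}
```

For a UTC-8 device, `-28800` seconds becomes roughly 1.8 × 10^19 when read as unsigned, a date many billions of years past 1970.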

Due to the nature of the bug, it's highly probable that Apple will be able to fix it in the next iOS update. As it likely involves an invalid variable, the code can be changed to validate user input so that the date and time can't be set in a way that damages the device's configuration.
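Such validation could be as simple as range-checking the timestamp before committing it. The sketch below is hypothetical: the constant and function name are invented for illustration, and Apple's actual fix may take a different form.

```c
#include <stdbool.h>
#include <stdint.h>

/* Illustrative guard: reject any wall-clock setting that would produce
 * a timestamp at or before the Unix epoch. A real fix would clamp to
 * whatever minimum the firmware actually tolerates. */
#define MIN_SAFE_TIMESTAMP 1LL  /* strictly after 1 Jan 1970 00:00 UTC */

bool is_safe_timestamp(int64_t t) {
    return t >= MIN_SAFE_TIMESTAMP;
}
```

With a check like this in the settings code path, the dangerous value never reaches the device's clock in the first place.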


Full List of Known Affected Devices:

Use Ctrl+F to locate your device.

iPod Touch 6th Gen

iPhone 5S

iPhone 6/6 Plus

iPhone 6s/ iPhone 6s Plus

iPad Air

iPad Air 2

iPad Mini 2

iPad Mini 3

iPad Mini 4

iPad Pro

Rumor: AMD’s Zen Processors May Feature Up To 32 Cores

During a presentation regarding datacenter technology and market trends, Liviu Valsan, a computer engineer at CERN, the European Organization for Nuclear Research, revealed some details regarding AMD’s upcoming ‘Zen’ Processors.

The upcoming CPUs are rumored to use a technology called Symmetrical Multithreading (SMT). Much like Intel's Hyper-Threading, SMT allows a single CPU core to run multiple threads, improving performance in heavily threaded workloads. Zen is also stated to provide a 40% improvement in instructions per cycle (IPC) relative to the current Excavator architecture, and will support DDR4 memory and PCIe 3.0.


Slide from the presentation


Perhaps the most exciting piece of information is that Zen CPUs may contain up to 32 cores. According to Valsan, the 32-core processors will actually combine two 16-core CPUs in a single package, using some form of interconnect to link them. We've previously seen multi-CPU motherboards, but packaging two CPUs as one processor like this would be unusual.

If AMD is planning to sell 32-core processors, it likely intends to enter the datacenter and server market. The best consumer-grade CPUs currently contain only 8 cores (the AMD FX 8320/8350/8370/9590 and the Intel i7-5960X), and most consumer CPUs in desktops and laptops have only 4, so it would be very surprising to see AMD release a 32-core CPU to the consumer market. If AMD uses a binning process similar to Intel's, it may sell 12- or 16-core "enthusiast" CPUs, much like Intel's Haswell-E series, which are binned Xeons with deactivated cores. Beyond that, we expect the 32-core CPUs to be designed for servers.

Zen is expected to be available toward the end of 2016, though no exact release date has been announced.

NCSoft Releases Q4 2015 Earnings Report

NCSoft has just released its earnings report for Q4 2015. All of NCSoft's games saw an increase in sales over the previous quarter. Keep in mind that consumer sales tend to be higher in the fourth quarter than in other quarters, so this is somewhat expected.


NCSoft Sales per game by quarter.

WildStar saw an increase from 1,727 million KRW (1.45 million USD) to 2,668 million KRW (2.33 million USD), an increase of over 50%. As WildStar recently switched to a free-to-play model, an initial surge in revenue is expected from the influx of new players. As those newly acquired players move on to other releases, WildStar's revenue may drop again if the game doesn't retain its playerbase.

Guild Wars 2 saw an increase from 20,699 million KRW (17.29 million USD) to 37,331 million KRW (31.2 million USD). Guild Wars 2 recently launched an expansion titled 'Heart of Thorns' while simultaneously making the base game free (albeit with some restrictions until you purchase the expansion). Sales of the expansion (priced between $50 and $100 depending on the package) should have pushed revenue up, along with the increased player acquisition from making the base game free.

Blade & Soul's revenue is up roughly 30% for Q4 2015 over the previous quarter, though it is unclear whether the data includes the founder's packs sold prior to the game's launch in Europe and North America, which work much like a pre-purchase. Blade & Soul launched in the West in January 2016, so with the possible exception of founder's packs, that launch should not be reflected in this report.

In addition to increased revenue, NCSoft's spending also rose by 12% over the previous quarter. A significant portion of this comes from increased marketing costs, primarily for the Guild Wars 2: Heart of Thorns launch.

The PDF of the full report can be viewed from the official source HERE

Riot Games Is Shutting Down League of Legends’ Alternative Game Mode “Dominion”

Citing that less than 0.5% of League of Legends players actively play Dominion, Riot Games has announced its removal. Dominion is a fast-paced 5v5 game mode built around capturing control points and short respawn timers, played on a map called the Crystal Scar. The vast majority of games take place on Summoner's Rift, the map around which the entire competitive scene is centered. Because League of Legends punishes players for leaving mid-match, and Dominion matches are generally much shorter than Summoner's Rift games, Dominion was often seen as the mode to play when you didn't have time for a "real" match. It was also a nice map for practicing champion mechanics: the mode doesn't "snowball" as hard, and the short respawn timers mean nonstop action rather than sitting in a lane farming half the time.


Dominion is a 5v5 game mode where teams compete over control points


In newer regions with smaller playerbases, Riot has already chosen either not to enable Dominion at all, or to make it available only during peak times, such as weekends. Dominion has been considered dead for a while, receiving limited attention for bugs and issues as well as limited support in new systems such as champion mastery and ranked queues.

Dominion will be permanently disabled in all regions on February 22, 2016, and will not be available in custom games or through the queue system. Any player who has won over 100 Dominion games will receive an exclusive, permanent icon for their efforts.

Because Dominion's different objectives create different interactions with champions, supporting the game mode may have limited Riot's design space for new characters. While champion balance and design are centered on Summoner's Rift, Riot has made mode-specific balance changes in the past when a particular champion was too strong on an alternative map.

While Dominion is being permanently disabled, it is unclear whether it will ever return as a temporary game mode. Riot periodically releases temporary modes with unique mechanics, generally as part of an event lasting 2-3 weeks. Quite often these are novelty modes like U.R.F., where all cooldowns are reduced by 80% and mana costs are removed, or One for All, where a team can field five copies of the same champion. However, we have also seen modes such as Ascension, which takes place on the Crystal Scar, the same map used by Dominion, but with a different objective. We will have to wait and see if Dominion ever makes a return as an event.

Amazon’s Free ‘Lumberyard’ Engine Helps Indie Developers Create MMORPGs

Creating an MMORPG is a long and difficult process. It is arguably one of the most expensive types of games to make: it requires developing a client, a back-end server, and the network code that lets the two communicate while still maintaining proper security. On top of development costs, MMORPGs are also expensive to maintain, requiring ongoing infrastructure and servers as well as customer support for players.

Amazon seeks to lower the barrier to creating online games by providing an engine that makes multiplayer development easier, backed by its AWS service for hosting servers. This lets developers focus on designing and building their game rather than on the challenges of writing an MMORPG engine and maintaining the infrastructure for a game played by millions.



Amazon’s Lumberyard editor. Image source:

Amazon's Lumberyard engine is built on technology from CryEngine, Double Helix, and Twitch. Lumberyard is cross-platform, meaning developers can target PC and consoles simultaneously; PC, Xbox One, and PS4 are currently supported, with Mac, Linux, iOS, and Android coming soon. Lumberyard is currently in beta.



Utilizing Cryengine’s Technology, Lumberyard can be used to create breathtaking visuals. Source:

Amazon has released Lumberyard's source code, allowing you to modify the engine if it lacks functionality you need to build your game.

According to Amazon, the only fees involved are for hosting servers through them, where standard AWS (Amazon's cloud computing service) fees apply. Lumberyard is integrated with Amazon GameLift, a pay-as-you-go service for hosting game servers and dynamically scaling capacity as needed. This can prove very useful for MMORPGs: you'll likely need extra capacity during launch or after major updates, but may not want to pay for that capacity when the game doesn't demand it. Lumberyard can also be used to create single-player games. For more information, visit the official source at


Using Lumberyard to Create Sandbox MMORPGs as an Indie Developer

Sandbox MMORPGs require less development time to create than theme park MMORPGs, because the players are effectively the content. In a sandbox MMORPG, less time needs to be spent creating quests, large game worlds with detailed cities, and other time-consuming assets, because the players create them.

Historically, this has not meant creating a sandbox MMORPG is easy. Existing MMORPG engines tend to be geared toward theme park MMOs and lack the functionality needed for a dynamic, changing game world. As a result, those hoping to create sandbox MMORPGs have still had to build a client and server from the ground up, which is expensive and time-consuming.

Amazon's Lumberyard, on the other hand, is free, highly customizable (thanks to the released source code), AAA quality, and integrated with Amazon's hosting service, making infrastructure easy to handle. Talented developers with limited budgets may now be able to create a quality sandbox MMORPG, thanks to the reduced barrier to entry.


Why Instructions Per Cycle is a Vague Performance Metric; Distinguishing Integer and Floating Point Speed

You may be familiar with the GHz myth: the idea that the speed of a microprocessor is directly measured by its clock speed. In the '80s and '90s, performance increases came primarily from raising the CPU's clock rate, so clock rate came to be used as a measure of performance. To address this fallacy, semiconductor companies such as AMD began naming products according to what they believed was their "effective" clock rate rather than their actual clock rate. For example, the AMD Athlon 64 3200+ actually runs at 2000 MHz, but it was marketed as a 3200+ because AMD's marketing team felt that better reflected the CPU's real performance.

The GHz myth is disproven by the fact that the amount of work done in each clock cycle varies between CPUs. A CPU clocked at 3 GHz completes 3 billion cycles per second, and instructions per cycle (IPC) measures how much the CPU gets done within each of those cycles. While clock rate measures how fast cycles complete, it says nothing about how much work each cycle performs. For this reason, consumers began using benchmarks that account for IPC in order to make intelligent purchasing decisions.
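The relationship can be made concrete: effective throughput is roughly clock rate times IPC, so a lower-clocked chip with higher IPC can beat a higher-clocked one. The figures below are invented purely for illustration.

```c
/* Throughput is approximately clock rate x instructions per cycle.
 * (Real chips vary per workload; this is the back-of-envelope model.) */
double instructions_per_second(double ghz, double ipc) {
    return ghz * 1e9 * ipc;
}
```

For example, a hypothetical 2.0 GHz CPU averaging 1.6 IPC retires 3.2 billion instructions per second, outrunning a 3.0 GHz CPU averaging 1.0 IPC, despite the 50% clock deficit.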

One thing many people don't realize is that modern CPU cores contain both an integer unit and a floating point unit. In computing, an integer is a whole number. A float, to put it simply, is a number with a fractional part, stored as an approximation that is not 100% precise.

Certain programs make heavier use of integers, whereas others lean on floats. For control flow and logic, programs generally use integers, because they're exact and therefore reliable. Consider the following code:

float a = 5;

float b = 5;

if (a == b)
    print("a is equal to b!");

With a float, you can't always be sure whether a value that should be 5 is actually stored as 5.0000004 or 4.9999996. For this reason, comparing float values for equality is much trickier, which is why integer logic is generally used for control flow. Floating point logic comes in very handy for complex scientific calculations and simulations, graphics rendering, and many games. Scientific calculations generally use doubles instead of floats; a double is a data type much like a float, but twice as precise.
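A minimal sketch of the standard workaround: because values like 0.1 have no exact binary representation, sums drift slightly (0.1 + 0.2 does not compare exactly equal to 0.3 in double precision), so floating point comparisons are usually done within a tolerance. The epsilon value here is an arbitrary example.

```c
#include <math.h>
#include <stdbool.h>

/* Compare two doubles within a tolerance instead of using ==,
 * since rounded binary fractions rarely match bit-for-bit. */
bool nearly_equal(double a, double b, double epsilon) {
    return fabs(a - b) < epsilon;
}
```

With this helper, `0.1 + 0.2 == 0.3` is false, but `nearly_equal(0.1 + 0.2, 0.3, 1e-9)` is true. Choosing a sensible epsilon for the magnitudes involved is the subtle part.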

The speed of the integer unit and the speed of the floating point unit are not always directly correlated. For this reason, benchmarks that attempt to measure IPC will vary significantly depending on the logic of the program. Some programs may primarily stress the integer unit, whereas others may rely more heavily on the floating point units.

When measuring the true performance of a CPU, it's best to measure the integer unit and the floating point unit individually, with code designed to test the limits of each. We highly recommend the site, as they provide tests for integer speed, floating point speed, and a mixed benchmark. We are not affiliated with them in any way.

By determining both the integer speed and the float speed of a processor, it becomes substantially easier to recognize the differences between processors. Mixed benchmarks are still important, as they're more representative of real-world scenarios, but to get accurate figures and truly understand a processor's potential, it's best to measure the two units individually as well.
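The idea of isolating each unit can be illustrated with two toy kernels, one built from integer operations and one from floating point operations. Real benchmark suites, like the tools mentioned above, are far more careful about vectorization, memory effects, and timer resolution; this is only a sketch of the principle.

```c
#include <stdint.h>

/* Stresses the integer unit: multiply, shift, add in a tight loop. */
int64_t integer_kernel(int64_t n) {
    int64_t acc = 0;
    for (int64_t i = 1; i <= n; i++)
        acc += i * 3 - (i >> 1);
    return acc;
}

/* Stresses the floating point unit: divide and accumulate
 * (this is just the harmonic series, kept for determinism). */
double float_kernel(int64_t n) {
    double acc = 0.0;
    for (int64_t i = 1; i <= n; i++)
        acc += 1.0 / (double)i;
    return acc;
}
```

Timing each kernel over a large `n` (for example with `clock()` from `<time.h>`) on two different CPUs gives a rough per-unit comparison, separate from any mixed workload.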



According to, the i5 6600K is 3% slower than the AMD FX 8350 at integer calculations when using all available cores, but 20% faster at floating point calculations. Because the FX 8350 has only 4 floating point units shared across its 8 cores, the i5 6600K pulls ahead in floating point math; yet with all cores fully utilized, it still falls behind the three-year-old FX 8350 in integer throughput. The i5 6600K also wins in single-threaded performance, since it has fewer, faster cores. The i5 6600K is usually the better processor (and also significantly more expensive), but for some niche applications that are well threaded and primarily use integer logic, the AMD FX 8350 can actually do a better job for a lower price.


By researching both the single-threaded and multi-threaded integer and floating point benchmarks for a processor, and knowing how your applications use the CPU, you can make the best purchasing decision by choosing the processor that best suits your needs.

AMD: Does “Double the Power Efficiency” Mean Double the Performance?

AMD's CEO Lisa Su has stated that AMD's next generation of GPUs, codenamed Polaris, may feature twice the power efficiency of its current products. This has caused concern among enthusiasts that the next generation of graphics products may not offer the raw performance they hope for. With Intel recently announcing that its future chips will be slower but more power efficient, it certainly may seem that way, as the market moves toward power efficiency.

Factors to Consider

It's difficult to interpret what Su means by "double the power efficiency", especially because she didn't name a specific card as the reference point. The Fury Nano is already very power efficient, so doubling its efficiency would be outstanding, whereas doubling the efficiency of a card like the R9 390X would not be impressive at all, considering the Fury Nano is already about 1.7x more power efficient than the R9 390X.

One thing to consider is that "power efficiency" is effectively an investor buzzword. Given how much value there is in phones, tablets, and laptops that last a long time on battery, power efficiency is a very appealing term to investors. For this reason, stating that Polaris will be twice as powerful carries less weight with investors than stating that Polaris will be twice as power efficient.

The last thing to consider is that in the discrete GPU market, with the exception of the Fury Nano, AMD is considerably less power efficient than its competition. AMD has developed a reputation in recent years for running hot and consuming a lot of power, so it's reasonable to believe AMD is aiming to fix this.

My Personal Prediction

With all of this taken into account, my prediction is that the Polaris flagship card will be 50% more powerful than the Radeon Fury while consuming 25% less power. As AMD will likely be stuck on a 14/16 nm manufacturing process for a while, given how difficult shrinking semiconductors has become, it will want to leave room for a future high-TDP, high-end card. Look back to 2012, when AMD launched the 7000 series on 28 nm, and compare those cards to the R9 300 series to see how the company uses its room for growth: the R9 390X is very much a brute-force "throw in more cores" effort to extract more performance from an existing node. By keeping the initial Polaris cards at low power consumption, AMD creates room to release more powerful cards in the coming years.
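For what it's worth, that prediction is arithmetically consistent with the "double the power efficiency" claim, since power efficiency is just performance per watt:

```c
/* Power efficiency = performance per watt, so the gain is the
 * performance ratio divided by the power ratio. 1.5x performance at
 * 0.75x power works out to exactly 2x efficiency. */
double efficiency_gain(double perf_ratio, double power_ratio) {
    return perf_ratio / power_ratio;
}
```

So a card that is 50% faster while drawing 25% less power would deliver precisely twice the performance per watt, without requiring double the raw performance.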