Bats, Matt Damon, and Simulations: Blurring the Line Between Reality and Fiction

In a world of fast cars (Tesla) and even faster information flow (Google), we still struggle to know what to believe when it comes to news – and nothing makes that more obvious than what is happening with the Coronavirus.

In 2011, Contagion – a movie about a bat virus that brings the entire world to its knees, both through the disease itself and through the intense fear that arguably causes even more damage – was released in theaters. Directed by the innovative Steven Soderbergh and brimming with sharp writing, multiple storylines, and committed performances, the film showed audiences an almost uncomfortably realistic look at what could happen just a couple of months after a virus begins to burrow its way into humanity's core.

Within 48 hours, it claims one life.

Within weeks, it claims scores more, all while our main characters desperately try not only to find viable information about the virus but also to save whatever shreds of humanity are left as society breaks down and body counts shoot up.

While it grossed a healthy amount at the box office, it also sparked many discussions about the film's realistic assumptions: what are the symptoms of the virus? How does one protect oneself from infection? When will a vaccine be created? And just how many lives will be lost before something can be salvaged?

A year later, a game called Plague, Inc. was released on mobile devices, and it was immediately evident that it would have the same effect as Contagion – only this time, the audience was in the driver's seat. Powered by research conducted by the developers and accounting for the many ways diseases can find their way into our daily lives, Plague, Inc. was a monster hit in every way possible. Both gamers and critics took note of its realism and the hundreds of possibilities on offer, including antibiotic-resistant, airborne, and nerve-crippling strains of a virus.
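Games like Plague, Inc. are, at heart, epidemic simulations. As a rough illustration of the kind of mathematics such simulations loosely echo (not the game's actual engine), here is a minimal sketch of the classic SIR compartmental model, with made-up infection and recovery rates:

```python
# Minimal SIR (Susceptible-Infected-Recovered) sketch.
# beta (daily infection rate) and gamma (daily recovery rate) are
# illustrative values, not taken from Plague, Inc.

def sir_step(s, i, r, beta=0.3, gamma=0.1):
    """Advance the epidemic by one day using simple Euler integration."""
    n = s + i + r
    new_infections = beta * s * i / n
    new_recoveries = gamma * i
    return s - new_infections, i + new_infections - new_recoveries, r + new_recoveries

def simulate(days=120, population=1_000_000, initial_infected=1):
    s, i, r = population - initial_infected, initial_infected, 0
    peak = 0
    for _ in range(days):
        s, i, r = sir_step(s, i, r)
        peak = max(peak, i)
    return peak, r

peak_infected, total_recovered = simulate()
print(f"Peak simultaneous infections: {peak_infected:,.0f}")
```

Even this toy version shows the game's core tension: with an infection rate well above the recovery rate, a single case snowballs into hundreds of thousands within a few simulated months.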

The next year, one of its developers gave a talk at the Centers for Disease Control and Prevention, where he discussed the power and usefulness of mere "video games" in tackling real-world issues. In a post-swine-flu world, senses were heightened when it came to hygiene and protection from germs (in a manner of speaking), which led to some improvements in global health. But as the recent Coronavirus outbreak has proven, nothing is safe for too long.

What is ultimately more surprising than the outbreak itself is the renewed attention being paid to the movie and the game. Within a couple of weeks, Contagion rocketed into the top 10 most-rented titles on the iTunes Store. Similarly, Plague, Inc. (along with Plague, Inc. Evolved, an updated version for modern consoles and PCs) saw massive upticks in player counts and sales, with a majority of players in China itself, desperately trying to predict the virus's trajectory.

On top of all of this, social media was flooded with information, much of it untrue and meme-filled. The developers of Plague, Inc. received so many questions that they began directing all inquiries to the CDC and other institutions for reliable guidance.

Despite all of the chaos, sadness, and downright horrible things the world is going through, the fact that we can turn to works of fiction to guide and inform us through real-world events should be admired and noted for all to see. Relating to make-believe scenarios goes beyond what has already been said of artists, filmmakers, and video game developers. Though not all of us have had the chance to create something for others to see, it is important to realize that thinking outside the box and expanding our horizons is more effective than basing all of our thoughts on static information searched at the last minute on a smartphone.

The ability to think, believe it or not, plays a greater role in our lives than we assume. Accepting facts as they are, without taking the time and effort to wonder for ourselves, is a crippling weakness in an otherwise capable human mindset. Films and video games are allowed a margin of error because they are essentially opinions, eloquently drawn out and expanded to encompass and encourage unique ways of thinking.

But this can only work to full effect if it is well reported and put above the everyday, frankly trashy news that goes in one ear and out the other. It all boils down to what we deem "the best" news to follow, and that can only happen if the filter is in check and the world is given the news it deserves, not just what it needs (a tad cliche, indeed).

The next time a work of fiction is deemed "realistic", maybe it should be treated as such and discussed at greater length – because, not to sound like a downer, the next big event could happen at any time.

Here’s to the tireless journalists, filmmakers, and video game developers who make it just a little bit easier to live on Planet Earth.

Photo Credit: BBC

#contagion #mattdamon #bats #earth #plague #gaming #film #realism #coronavirus #wuhan #china #news #realworld #simulation #2020 #future #thinking

NVIDIA’s GeForce NOW: Emphasis on the NOW

After many years of beta testing (and patience), NVIDIA's own game-streaming service is now official and off the beta bench – and it looks to crush all known competitors with its surprising price point and innovative features.

Out and ready to roll: GeForce NOW is offering both free and paid ways to access its game-changing technology.

Two years. That is how long the modern version of GeForce NOW has been in beta (since 2018) – not counting the time it was known as NVIDIA GRID, a Netflix-style subscription model for PC gaming that was discontinued and sent back to the drawing board. The pitch? Imagine taking remote control of a $6,000 NVIDIA Tesla graphics card (an expensive beast of pure gaming power) through the cloud to play games that would never even start on your low-end $300 laptop. Imagine not having to sit through 30-minute downloads, instead having your games ready to go in the blink of an eye – literally. Then imagine doing this wherever a sufficiently strong Wi-Fi or Ethernet connection exists, sitting down, and enjoying 1080p HD streaming that (almost always) runs extremely well.

During my years at college, I was one of the lucky people who, after signing up for the beta waitlist and anxiously waiting two months, found that GeForce NOW is a major game-changer – and then proceeded to log hundreds of hours on the service, exploring just how far this technology could go. True to its name, NVIDIA's software makes it supremely easy to get started and almost excruciatingly difficult to put down (except when you are booted off after staying on for four hours straight).

More importantly, the service proved to customers and rivals alike that harnessing the power of the cloud could level the playing field between those who had an expensive gaming rig and those who simply could not afford the expense. Speaking as someone who used this technology frequently, I would always wonder how long this type of service would remain in *ahem* a free state. Given that Google released its own game-streaming service, Stadia (to mixed reviews and a lack of support for its small group of users), and that even Sony (PlayStation Now) and Microsoft (xCloud) are building streaming services with prices attached, it was only a matter of time before NVIDIA put a number on its value.

Millions of people have already gotten onboard with the idea of cloud gaming – and there is a lot of exposure to be taken note of as shown by device usage above.

And it finally has. As of one day ago, GeForce NOW is official and ready to roar onto scores of devices (yes, mobile devices are compatible too, though only Android for now) with not only a premium tier but also a free one. Even though the free tier has short session lengths and gives its users lower priority than premium members, it is a wonder that, in this day and age, software as functional as GeForce NOW can be offered for free to anyone with even an inkling of interest. And at $4.99 per month (at least for now), the premium membership could be a no-brainer, especially for those who have been using GeForce NOW for the past couple of years.
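As a rough, admittedly simplistic bit of arithmetic (the $1,200 rig price is just an illustrative figure), here is how long the $4.99 monthly fee could run before matching the one-time cost of a gaming PC:

```python
# Back-of-envelope break-even: months of subscription fees before the
# spend equals a one-time gaming-PC purchase. Prices are illustrative.

def breakeven_months(pc_cost: float, monthly_fee: float = 4.99) -> float:
    return pc_cost / monthly_fee

months = breakeven_months(1200)
print(f"~{months:.0f} months before the fees match a $1,200 rig")
```

Twenty years of fees before matching the up-front cost of a mid-range rig is, of course, ignoring hardware upgrades and resale value – but it shows why the price point turns heads.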

As for the financial impact, there is no way of telling just how deep it may run. Considering how transcendent this technology really is (it sits in gaming, hardware, and streaming all at once), competitors like AMD, Google, and Microsoft could be seen as inferior when GeForce NOW's maximum capabilities are taken into consideration. NVIDIA deserves a round of applause for the official rollout of its service and for its continued dedication to the future of gaming and to an entire community that enjoys the product for what it is and what it represents.

This kind of news, frankly, makes me happy both as a customer and as a businessperson. Giving something a couple of years of good, slow cooking time, and taking all sorts of feedback and improvements into account, is one of the surest ways to get a product off the ground – and that is exactly how NVIDIA has handled GeForce NOW. It could well lead to a swift takedown of the struggling Google Stadia, and if given the time and coverage it deserves, it is entirely plausible that the service will reach juggernaut status and completely upend the way gaming is done today.

Photo Credit: AnandTech

Making Your Opinions Matter in 2019

The idea of communicating thoughts effectively and precisely has been modified to fit our modern day needs.

Not all communication requires the use of hand gestures and facial expressions.

One of the first articles I wrote dealt with communication and how to maintain efficient, strong communication with those around you – especially for those of us (like myself) who will soon be working in the business world, where being able to communicate well with your boss or coworkers can matter more than whether you were on the all-star debate team in college. Something I had noticed for a while, both in college and outside of it, is that it doesn't really matter how outstanding and unique one's resume is in terms of academic prowess; if you are able to speak the right way and, more importantly, listen to others, you will find doors opening to usher in new opportunities. Everyone likes a person who can talk, but everyone LOVES a person who can talk, listen, AND converse with others.

There are TWO important tips for improving communication:

1. “Lend me your ears”

Marc Antony kindly asks for your attention.

Though the circumstances of Marc Antony's speech during the climax of Julius Caesar were nowhere near positive, the line still resonates today – just not in the way most people would think. It all comes down to one thing: initiative. Without it, one cannot contribute productively, lead others with vision, or even progress in relationships, be it with immediate family members or total strangers. And it isn't as if we lack initiative or the potential to take the reins of life: we all have some part of us that wants to take control of our journey.

Part of the reason there are those with initiative and those without is a lack of openness. With so many things we can stick in our ears or cover them with (namely AirPods, headphones, and what have you), it is easy to get caught up in our own worlds and become lost in a bubble of comfort. Sure, it is great to occasionally lose yourself in a familiar world of pop music or gritty TV shows, but it is important to disconnect and explore what reality has in store for you; no one got anywhere significant without a little trial-and-error exploring here and there.

2. Ask, ask, ask…

Here's Snoopy asking the real questions. Source: Pinterest.

Beyond the occasional food question, Snoopy proves that it is important to think about even the simplest things – and slap a question on them. The same goes for everyone else: if there is even the slightest doubt about something, go ahead and question it. There is never "a dumb question", and the worst question "is the one never asked" – but let's be honest, no one was born with a database the size of Wikipedia. During anything from an elevator pitch to a job interview to a family dinner, asking questions shows that you are not only engaged with what is going on around you but also genuinely making an effort in your current task.

Asking questions is one of the most underrated tools of efficient communication. Today we lean heavily on search engines such as Google to answer our daily questions, but asking a real person with experience in a subject is just as rewarding as tapping away on the Internet – and the ability to properly articulate questions and converse with an individual shows others that you can communicate in a smart, time-efficient manner.

There is something to be learned from communicating with anyone, regardless of how different they may be from you. With the right planning, tips, and mindset, anyone can engage in proper, beneficial communication and propel themselves to a higher standard.

The Future of the Dollar Bill

With the recent surge in Bitcoin, and the existing interest in non-physical currencies, it may be possible that paper money will become a thing of the past – very soon.

“Dollar dollar bill y’all”…or, perhaps in the near future, “Bit bit coin y’all”.

You may have already heard about the recent (a word I am using in a relative sense) increase in the value of Bitcoin. For those of you who aren't familiar with the notorious "virtual currency", it is a currency mined with powerful computer rigs rather than iron pickaxes, and it can be used for ordinary online payments, be it on eBay or through services such as PayPal. Using, mining, and managing Bitcoin is the type of surreal experience that only a few people have learned and/or experienced so far, but it could very well become our future.
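For illustration only – this is not Bitcoin's actual protocol, which double-hashes a binary block header against a far harder target – the brute-force idea behind "mining" can be sketched like this:

```python
import hashlib

# Toy "mining" sketch: keep hashing the data with a changing nonce until
# the hex digest starts with enough leading zeros. Difficulty 4 means an
# expected ~65,000 attempts; real Bitcoin difficulty is astronomically higher.

def mine(data: str, difficulty: int = 4) -> tuple[int, str]:
    prefix = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{data}{nonce}".encode()).hexdigest()
        if digest.startswith(prefix):
            return nonce, digest
        nonce += 1

nonce, digest = mine("hello, ledger")
print(f"nonce={nonce} digest={digest}")
```

The key property is asymmetry: finding a valid nonce takes heavy computation, but anyone can verify it with a single hash – which is why mining demands those powerful rigs.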

The coin. The stone. The feather, the pearl, and, of course, the gold bar. Each of these items – along with perhaps any particular object found around the world – has at some point in human history represented some kind of currency. The way we used these currency types to conduct business transactions was a milestone in implementing the modern market system and moving past the older ways of bartering. Today we use a mix of metal coins, paper bills, plastic cards, and even our very own smartphones to pay for everyday expenses and other purposes.

With the passage of time have come new and different ways of purchasing everyday goods and services. The ways virtual currency can be used, and the variety of applications in which it can be utilized, are expanding at a slow but deliberate pace, and it could well be that in the near future many people on this Earth begin using virtual currency for their daily transactions. Contrary to what has already been reported, virtual currency is still being developed and monitored, as can be seen in the recent uptick in the price of Bitcoin.

With each passing year, new technologies and ideas are born and refined as they are analyzed and put to use by researchers and everyday people alike. In the case of Bitcoin, with its frankly volatile value and history, no one really knows what the future holds for something as valuable and aggressively unknown as it is.

Image Source: Deposit Photos

Why Cloud Gaming is Here to Stay

With Google’s cloud gaming streaming service Stadia announced and slated for a November 2019 release, the future of gaming may already be here.

Imagine exploring Ancient Greece and conquering history…from almost any device you own.

It all started with the console. The NES, SNES, Nintendo 64, and GameCube are – rightfully – revered for their classic game libraries and their profound impact on how video games grew in popularity around the turn of the millennium. It wasn't uncommon to see children clustered together playing titles such as the Smash series or attempting to rush through the levels of the unique and revolutionary Super Mario Bros. and its spin-offs. The desktop computer (PC for the millennials) eventually learned some tricks of its own and began to pump out ground-breaking games, including DOOM, Baldur's Gate, Command & Conquer, and even the first Diablo. Competition existed, yes, but in the end all gamers were happy to be living in a time when their gaming habits were fully satisfied by both console and PC.

Fast-forward to 2019, and it is a whole other ball game. Welcome to the era of 4K-capable consoles that double as breeding grounds for cutting-edge technology, delivering joy to gamers and praise from critics across multiple acclaimed titles, including Spider-Man, God of War, and Super Smash Bros. Ultimate (all three of which are, ironically, modern reinventions of successful pre-2010s franchises). PC gaming has become a behemoth of an industry, with companies such as NVIDIA and AMD competing to deliver the best graphics for the latest AAA titles, while others, including Valve, CD Projekt Red, and even Humble Bundle, strive to provide the best value for your money, with infamous sales held almost constantly, often in hilarious fashion.

However, there is one evolution of gaming so threatening and potentially game-changing (pun intended) that both console and PC companies would quiver in their boots given the chance: the cloud. In recent times, the cloud (the virtual space in which terabytes and yottabytes – not dinosaurs – of data are stored and accessed daily by millions of people) has become a force to be reckoned with, as many companies find that storing information in the cloud is more helpful and accessible than relying on local storage servers or USB sticks (a relic indeed). Gaming through the cloud is a relatively new concept, one that even gamers themselves have doubted: how can one game, at the standards set today, without either a console or an NVIDIA-powered laptop? The answer may lie above us – literally.

Two services have been introduced that capitalize on the cloud as a gaming platform: GeForce Now (created by NVIDIA) and Stadia (from Google), the latter of which already has pre-orders and release dates lined up. Essentially, the concept boils down to streaming a game from a tailor-made, ultra-powerful gaming PC to any device of your choice, almost like an interactive YouTube video. The stronger your internet connection, the better the image and audio quality (and the lower the latency), which leads to Google's claim of 4K, 60 FPS gaming on an ordinary Chromebook.
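To put rough numbers on why the connection matters so much, here is a back-of-envelope calculation (the 400:1 compression ratio is an illustrative assumption, not either company's published encoder setting):

```python
# Illustrative bandwidth arithmetic for cloud game streaming.

def raw_mbps(width: int, height: int, fps: int, bits_per_pixel: int = 24) -> float:
    """Uncompressed video bandwidth in megabits per second."""
    return width * height * fps * bits_per_pixel / 1e6

raw_4k60 = raw_mbps(3840, 2160, 60)   # ~12,000 Mbps uncompressed
compressed = raw_4k60 / 400           # assume a ~400:1 video codec ratio
print(f"Uncompressed 4K60: {raw_4k60:,.0f} Mbps; streamed: ~{compressed:.0f} Mbps")
```

Uncompressed 4K60 video would saturate any home connection many times over; only aggressive video compression brings the stream down to the tens of megabits per second that a good broadband line can actually sustain.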

Although many may still point to the use of PCs in this situation, it should be made clear that the user won't have to splurge on a $1,200 gaming machine; all he or she needs is a subscription plan to use a PC through the cloud, ready to go, and voila. Google has even put a price on its monthly plan – a mind-boggling $10 – which can lead to hundreds, or even thousands, of dollars in savings for the average PC gamer. Perhaps this is a sign that the times are changing; for the cash-strapped gamer, all the better.

Image Credit: Techcrunch

The Future of Digital Music

Apple’s rumored move to remove iTunes permanently suggests that subscription-based music models are here to stay

Kind of hard to decide where your money goes, right?

By Shaan Bisht; Published on June 2, 2019, 6:29 pm.

Since the early years of Pandora Radio and Spotify back in the 2010s, the idea of streaming the music you wanted to hear again and again, rather than paying for every individual song, was still being developed. After all, streaming music works for two groups of people: the customers and the artists themselves. Both want maximum value from a single stream of revenue: customers want to listen to their favorite music and support the artists without shelling out a high amount of clams, while the artists want to get their paychecks and do right by their fans. It is difficult to give everyone what they want, especially with multiple controversies over the past years that dealt directly with money.

For example, in 2014, Spotify came under heavy fire from an artist by the name of Taylor Swift, who sought fair compensation for people both big and small in the music industry and targeted the broken monetary system the Swedish streaming company had enforced since its inception. The situation escalated when Swift took her entire music library off not only Spotify but other streaming services as well, until a settlement could be reached and the income distributed in a more proper manner. This is where it helps to understand how everyone gets paid through streaming, and how the rumors surrounding Apple's removal of its long-used iTunes service could spell the end of an era.

Unlike services like iTunes, where customers would buy music and the money would then be split between the parent company (i.e., Apple), the artist, and their representation, companies like Spotify pay themselves and the artists based on how popular a song is – specifically, the number of streams it has. This is similar to how radio stations generate revenue, and given the staying power of radio (a far longer run than iTunes itself), the question isn't whether money will be generated (spoiler alert: it will. A LOT.). With more than eight prominent music streaming services already in operation, one might stop to ponder why Apple would even bother pulling iTunes out of rotation.
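A hypothetical sketch of that pro-rata model – a revenue pool split among artists in proportion to their share of total streams (all names and numbers are made up, not any service's real figures):

```python
# Pro-rata payout sketch: each artist's cut of the revenue pool is
# proportional to their share of total streams.

def pro_rata_payouts(revenue_pool: float, streams: dict[str, int]) -> dict[str, float]:
    total = sum(streams.values())
    return {artist: revenue_pool * n / total for artist, n in streams.items()}

payouts = pro_rata_payouts(
    revenue_pool=1_000_000.00,
    streams={"big_star": 9_000_000, "indie_act": 900_000, "new_artist": 100_000},
)
for artist, amount in payouts.items():
    print(f"{artist}: ${amount:,.2f}")
```

This toy version also makes the controversy visible: the pool follows aggregate popularity, so a small artist's payout depends on how everyone else's streams move, not just their own.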

On that particular matter, it boils down to a couple of reasons: 1) Apple has already operated its own music streaming service for a while now (the mixed bag called Apple Music), and 2) not many people use iTunes anymore. Personally, I find it hard to sit down and manually transfer music and movies through a frankly agonizing process involving cables and slow transfer rates. Perhaps it is because I have become spoiled by the new generation of technology, what with the easy downloading of playlists from Spotify and crystal-clear music quality from any streaming service. But maybe, by letting go of its once-premier media center, Apple will be able to focus on something else for the near future.

Image Credit: