What shall it profit a man if he gain the world - and lose his soul?
Metropolis (Fritz Lang)
In this blog post we answer a few of the questions @spacesjut was kind enough to ask us. If you have any questions you would like us to answer in a blog post please send them to firstname.lastname@example.org
The player is able to delete chunks of meatbags’ conversation. How does it work? Does it work in retrospect or only during “real-time”-surveillance? Does all communication work like the example, or are the meatbags able to talk in person and realize that messages go missing?
When an NPC sends an email there is a slight delay before it is delivered. This is because all emails are run through the corporation’s off-station ‘sixth eye’ surveillance systems. The player can take advantage of this to change or delete the message before it reaches its recipient. Even after an email arrives, the NPCs don’t always check it straight away; you can use this gap to delete or change the email.
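A rough way to picture the mechanic: a message has an in-transit window (the ‘sixth eye’ routing delay) plus an unread window after delivery, and the player can tamper with it during either. This is a minimal sketch, not the game’s actual code; the class, the `SIXTH_EYE_DELAY` value, and the method names are all invented for illustration.

```python
SIXTH_EYE_DELAY = 2.0  # hypothetical routing delay in seconds; real value unknown

class Email:
    """Toy model of an in-game message passing through off-station surveillance."""

    def __init__(self, sender, recipient, body, sent_at):
        self.sender = sender
        self.recipient = recipient
        self.body = body
        self.sent_at = sent_at
        self.read = False

    def in_transit(self, now):
        # Window 1: still being routed through the 'sixth eye' relay.
        return now - self.sent_at < SIXTH_EYE_DELAY

    def tamper(self, new_body, now):
        # The player may edit (or blank) the message while it is in transit,
        # or after delivery but before the recipient has checked their inbox.
        if self.read:
            return False  # too late: the recipient has already seen it
        self.body = new_body
        return True

    def check_inbox(self, now):
        # The recipient only sees the message once routing has finished.
        if not self.in_transit(now):
            self.read = True
        return self.read
```

Under this model, deleting a message is just tampering with it before the recipient reads it, which matches the “gap” described above.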
It’s worth mentioning that each NPC has a unique personality; if you repeatedly send emails that deviate hugely from an NPC’s personality, suspicions are going to be raised.
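One plausible way to model this suspicion mechanic is to score each forged message against the NPC’s known trait vector and let out-of-character messages accumulate suspicion over time. Again, this is a hypothetical sketch: the trait names, tolerance value, and scoring formula are assumptions, not the game’s implementation.

```python
def deviation(personality, message_traits):
    """Mean absolute difference between trait vectors (0.0 = perfect match)."""
    return sum(
        abs(personality[k] - message_traits.get(k, personality[k]))
        for k in personality
    ) / len(personality)

class Npc:
    def __init__(self, personality):
        self.personality = personality  # e.g. {"formality": 0.9, "warmth": 0.2}
        self.suspicion = 0.0

    def receive_forged(self, message_traits, tolerance=0.2):
        # Messages within the tolerance pass unnoticed; anything beyond it
        # adds to suspicion, so repeated out-of-character forgeries add up.
        d = deviation(self.personality, message_traits)
        if d > tolerance:
            self.suspicion += d - tolerance
        return self.suspicion
```

The accumulation is the important design point: a single slightly-off email is survivable, but a stream of them steadily pushes the NPC toward raising the alarm.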
If I start a new game, will I have to go through everything again?
We have two game modes, ‘Story’ and ‘Survival’. There are lose conditions for both. When you start a new game the station is reset back to its default state and a new crew is generated.
However, each NPC has a personal identification number you can use to ‘hire’ them onto a new station if you miss them.
Can the player take control over a maintenance bot to repeatedly sabotage the central computer system and use the common failures of the system as an argument to get the meatbags to give me control over this error-prone system?
If you run around smashing things with a maintenance bot, it’s much more likely they are going to take the maintenance drones in for repair and analysis than hand over more control to the central computers.
For maximum control you want to demote intelligent engineers, promote the incompetent and slowly manipulate people into opening system vulnerabilities for you.
It’s a balancing game though; full control isn’t necessarily the best way of reaching your end goals. Sometimes you are going to want to let your favoured people take more control if you are playing the long game.
The Station is shown to have gravity. Is it on a celestial body with a sufficient mass to provide said gravity? If not, how does the station provide it and can it be switched off? Are you just handwaving these things or will there be large amounts of technobabble trying to explain why such things work?
Oh, there will be techno-babble. The station was originally designed to spin and have spin gravity. It seemed a little odd you could control all systems other than the gravity though so it was cut. All that floating around adds a lot more work for our small team.
Unless we come up with a better idea I am debating having the station fixed to a small dense moon or asteroid that orbits the planet to account for this. It’s still up in the air though.
Will the player’s drone be the only active bot? Or will there be maintenance bots, military bots, …? If so, is it possible for the player to actively push the automation of certain tasks to get them out of the hands of the meatbags?
You have the surveillance camera drones, these can interface electronically with equipment and little else.
You also have the mechanical drones; these are multi-purpose and come with all sorts of modular attachments depending on what function they are performing. The basic model has a simple mechanical arm. You also have modules for security drones, gardening drones etc. (Who knows, you might even find some tech from the planet to upgrade these capabilities.)
You can set automation of certain tasks for sure and set up the whole station so that the sensors and machines all lie to the meat-bags. It’ll be tricky but it’s just about possible…
Will the player’s drone (or another bot controlled by the player) be able to interact with the environment, eg. to “misplace” certain objects?
Yep! That would be the mechanical drones you ‘dock’ with. You can use your arm to stuff an NPC’s meds in the washing machine or move the emergency oxygen tanks to the incinerator. Most objects are interactive in some way; even when they are bolted down there are often hacking actions you can perform on them.
How will the controls work? 5DOF-Navigation, with mouse as two-axis rotational input? Also: will there be access-hatches usable only by the drones?
Everyone can use the access-hatches, but you have a fast-track system fixed to the roof of the station that propels drones around at speed. Controls are subject to playtesting, but at the moment it is the standard WASD and mouse with two-axis rotational input, yes.
Any A.I. smart enough to pass a Turing test is smart enough to know to fail it.
IAN MCDONALD, River of Gods
A lot has been said about Human Orbit’s AI but we don’t often mention the role the objects in the station play.
Every object you will see in Human Orbit has some sort of practical function or has personal meaning to a character on the station. We have no desk clutter placed solely to make the environment feel real, no faked machines designed solely as a backdrop. All drawers can be opened, all objects have their use; if you see a washing machine, then you can wait and see how the characters use it to wash their clothes.
It’s a working, living station!
“On two occasions, I have been asked [by members of Parliament], 'Pray, Mr. Babbage, if you put into the machine wrong figures, will the right answers come out?' I am not able to rightly apprehend the kind of confusion of ideas that could provoke such a question.”
Charles Babbage – Inventor of the first programmable computer
In an earlier blog post (Framing Effect), we mentioned in passing that we would be updating to a new version of the engine that we are using for Human Orbit. Today, we’d like to share some results from that update and show you just one of the reasons why we have been eager to upgrade. Upgrading our engine to Unity 5 opens up a lot of avenues for us graphically. One of the more significant changes is that it allows us to switch to full PBR (physically based rendering). In PBR, all materials in the game use a unified shading model that corresponds much more accurately to real-life materials and allows materials to respond to light sources in a consistent and accurate way. The result of this improved lighting model is scenes that feel much more cohesive and more natural.
By combining this with the massive FPS improvements that Joe has made in the last few weeks, we’re confident that we can create a look that is distinctive, beautiful and performant. Have a look at these new screenshots of the algae room, and compare them to the screenshots that we shared a couple of months ago. The top image shows the shiny new look of the game, and the bottom image shows the shameful old version. I think you’ll agree that the improvement is immense!
Now we can show our faces in public again!
"...I make friends. They're toys. My friends are toys. I make them. It's a hobby..."
J.F. Sebastian – Blade Runner
What have we been up to at Human Orbit?
Since the release of the trailer we have mainly been working on some housekeeping. We are pulling a lot of things apart and putting them back together in a more efficient way so that the game runs nice and smooth. We have quite a few good ideas for optimisation and I expect the game to be running fairly well even on older machines.
We have also started work on the dialogue tools! Before long we will have a tool up and running that lets us add dialogue to the game much more quickly than before. Of course, we are releasing this along with the game for all of the modders out there. I’m looking forward to seeing what you guys come up with!
“By far the greatest danger of Artificial Intelligence is that people conclude too early that they understand it.”
Eliezer Yudkowsky