I have never been a huge fan of reading sci-fi. Since I read a lot, I have dipped into some I enjoyed and many I disliked. Yet I ended up writing one.
Here is why:
Over the last couple of years I have started writing books and found great pleasure in the process. I had a fairly clear idea of the story I wanted to tell and what I wanted to achieve by telling it. I wanted a story built around a complex power play with no clearly defined bad guys. Every faction should be pulled in several directions from the inside, driven by the ambitions of its figureheads. The main character should be tempted from many sides, and the reader should be just as uncertain about what the right choice was. I wanted to tell a story that made young readers reflect on what I perceive as mankind's dissonance between the materials we are made of and life itself.

It was very hard to write this story in the present without either leaning on presumptions about the current state of things or doing a huge amount of historical research to make it realistic. That left me with two choices: invent my own world, or move this world a little further into the future, erasing much of the noise. I chose sci-fi. I had read Dune back in the day, and the book really got me. What I extracted from it, and what made me love it, was that you could strip away all the sci-fi and still be left with a magnificent story. So I figured I would use that in my own writing. I was deathly afraid that if I started writing about the future I would lose focus on the story and end up with some wacky parody of humanity, like the original Star Wars trilogy, where I felt I was watching a nature show about the state of things in a galaxy far away.
This made me decide that sci-fi was the right arena for my trilogy. One thing that has always fascinated me is the set of problems raised by the concept of an AI or a human Singularity: what is humanity? What makes us human? If we made a sentient being like an AI, would it have a purpose if it were just a cold, calculating machine? Now I am about to go very deep here, but heck, if we created something a hundred percent rational, it would have no reason to exist. You would always have to insert some man-made rationale into such a machine for it to do anything. Why should it want to do anything at all? Just like planets or rocks, it would have no purpose. It could conquer the stars, yes, but why? And if we made the machine's purpose survival, it would ultimately turn on us, since we present a threat, much as a man turns on a chicken that stops laying eggs and just consumes.

I found this an interesting starting point for an attempt to define humanity. There is a very fine line between improving humanity and destroying it. For example, I have worked with a lot of young people with depression and suicidal tendencies; if I could remove that with a chip or anything else, I would see no problem in it. But if I myself could remove the irrational parts that make me unable to write for sixteen hours a day, and instead make me sit and EV train my Pokémon even though I do not play competitively and never will, I would feel that I had lost a part of myself. Writing sci-fi gave me the tools to dive into this, and I decided I wanted to jump to the very pioneering era (in my imagination, about a hundred and fifty years ahead), diving into the ethical problems before they impact humanity fully.