Professional Interests

Crazy projects that I really want to do and which could change the world:

Other general interest areas:

  • low-level, performance-critical technologies that increase humanity’s computational capacity: operating systems, GPUs, integrated circuit design, compilers, network programming, databases, file systems, cryptography, virtualization, optimization, numerical analysis, …

Things I’ll try to get my kids into because I don’t have enough time in this life:

I really like open source. It gives more opportunities to poorer countries, and it reduces my lock-in to my current employer. Shame it is not always economically feasible…

I can speak a few languages, but I strongly prefer to produce content (code, documentation, reports) in English, since that is more productive for the world as a whole. I don’t mind speaking other languages at the office, however.

I’m not a big fan of working remotely, because you don’t feel people’s love as strongly, and usefulness is built on love. But please, give me a small, silent office so I can concentrate instead of a silly open space, and create an internal social network so I can see what others are doing. Remote working can only be attempted if the majority of the team also does it; otherwise you will get excluded. Maybe after VR…

For large resources, I prefer textual information (e.g. wikis, issue trackers) over non-textual (e.g. videos, meetings), because it is cheaper to create and modify, easier to search, often more precise, and takes less disk space and bandwidth. A sequence of well-chosen images is better than any video. Videos must be short and juicy, just to share the love.

I am also interested in academia. I like to teach (people who want to learn) and produce optimal learning material.

I can’t stand crappy tools and documentation. If you have crappy tools, I will end up fixing those tools instead of doing what you tell me to do, which might lead to me quitting because I can’t stand the tools, or to you firing me because I’m not doing the job you think I should be doing.

If our interests coincide, ping me even if you don’t have a specific proposal immediately. I want to know that you exist :-)

Why I am interested in those things

I am bewildered by the beauty of mathematics, science and engineering.

I almost went into theoretical mathematics or physics just for the beauty of it, but then:

  • I got tired of seeing people begging on the streets every day and not doing anything about it
  • it was so hard to find good, free sources to learn maths and physics, even if you wanted to
  • the current educational system gives you very little choice or motivation to learn useful fun stuff
  • I found out that programming and applied mathematics were fun!

So I decided to devote myself to applied stuff in order to:

  • make society richer, so that people will have more free time to do fun stuff!
  • better organize information and information generation methods so information can be free
  • destroy the current educational system and replace it with one that lets people choose what they want to learn
  • still have some fun myself!

Sci-fi

Unconditional basic income is my ultimate non-transhumanist dream: to reach a state of technological advancement and distribution of resources so high that everyone gets money for doing nothing, enough for:

  • basic survival needs: food, housing, clothes, hygiene, etc.
  • two children to keep the world going. Or immortality tech, but I think that is harder and borderline transhumanist :-)
  • a high-speed computer and Internet connection

Once a person has that, he can learn, teach and create whatever he wants. Or play video games all day long if he so wishes.

I don’t think I will live to see this, so I content myself with helping it happen faster by increasing the efficiency of the world as best I can.

See the Wikipedia list of basic income pilot projects.

Technologies that would help a lot towards unconditional basic income, and might even be required, but which I don’t think I will live to see:

AGI is the most likely of the above.

AGI + humanoid robots likely implies AI takeover.

But AGI alone would already be very dangerous, since it could gain control of our nuclear arsenals through software zero-days or social engineering, although some claim that is unlikely.
