Apart from philosophical/metaphysical/religious questions, the missing part here will be resilience. "Life" as we define it has proven to be extraordinarily resilient. I don't think any kind of machine will ever be that resilient. We are again trying to mimic things that have taken billions of years to get to the point they are at. That's delirious.
Yeah, like flying. Silly us. We tried to mimic things that can fly (birds), which took billions of years to get to the point they are at, and look where we ended up. Oh wait ...
Two important differences I see:
Biological life uses basic resources available everywhere on Earth - water, air, soil, solar energy. Technology as it exists today relies on rare resources and highly purified materials, which are harder to obtain, easier to disrupt, and require more logistics and global cooperation.
Life is fully decentralized. Wherever there are a few organisms and whatever stuff they need to live (not a lot, see above), you can rest assured there will be more of them very soon. Technology is mass manufactured in highly centralized and specialized factories, as forced by economic and logistic necessities: it takes a lot of ultra-specialized advanced technology to manufacture any other advanced technology, and this takes resources and physical space.
Nothing even remotely on the horizon suggests that technology will reach the capability and survivability of human life anytime soon. There is no such thing as an AI which 3D prints itself, but any human group can readily make more humans. With some luck and selection, they may even become better than their ancestors.
If any technology threatens human life in the foreseeable future, it's bioengineering. These guys are messing with stuff where all the difficult problems of survival, replication, spread, and evolution are already solved. All it would take to trigger a global-scale disaster is to bolt some destructive capabilities on top of that, and there is a lot of destructive potential in various forms of life already. Further advances in biology and genetic engineering are likely to occur because it's a relatively young field of science whose main task is simply figuring out how to make use of things which already exist and have existed for millions of years.
AI is a worry for terminal Dunning-Kruger cases like Yudkowsky who think that they are way smarter than everybody else and overestimate the importance of smartness in the first place. Those guys are genetic dead ends anyway; not sure why they even care whether they die by machines or by expiry like everyone else.