
Venting: I'M FUCKING SICK OF EXISTENCE

PM_ME_STRIPPERS · IYAIYAI · ★★★★★ · Joined: Nov 8, 2017 · Posts: 15,813
ARGHHHHHHHHHHHH

FUCKING 27 YEARS OLD AND IT HAS NOT GOTTEN BETTER, ONLY FUCKING WORSE
 
Get some coke gooze and fuck a hooker, thats the best medicine dont care about the costs
 
One year younger than you, but I feel exactly the same.
 
Did no strippers PM you then?
 
it only gets worse
 
i am seriously considering hard drugs like this. this world just sucks too much
 
Nigga when tho
Considering the points I made here: https://incels.is/posts/11828757/
I think within the next 5-50 years is a realistic time frame. If you are under 30, I would bet it will happen within your lifetime (assuming you don't die of unnatural causes early on).

I wrote some more on this in one of my first threads but it's a bit long :feelskek:

 

high IQ thread brocel. It makes me have an existential crisis to think about though. Especially consequences of ASI. I don’t know if our quality of life/mental problems in the current moment can be benefited from thinking about this shit too much. Might make one go insane thinking about the negative possibilities. Like how do you make the wager that you should kill yourself to avoid eternal doom? And furthermore if the ASI is a time traveling omnipotent God that’s anything like the Abrahamic religions, wouldn’t killing yourself commit yourself to eternal doom anyways?
 
Realistically, we are way too unsure and too cowardly to kill ourselfs right now because of some abstract logic about long term risks of being tortured. So the best thing you can do is keep some reliable method of suicide at home, something you trust yourself to actually use if the need arises. If artifical intelligence starts becoming the next big thing, you start looking for warning signs. And then you probably still don't have the guts to do it anyways :feelskek:. So thinking too much about suicide is mostly pointless I would say.

There is also another risk: What if the AI manages to revive you, as in, find out what exactly makes you you and recreate that part after your passing. Seems pretty unlikely and one more thing where you really can't do shit, just hope it doesn't happen.


The possible upsides are insane though and thinking about them helps me to get through the day a lot. We might
- cure aging
- have personalised perfect sexbot partners who are indistinguishable from real humans in behavior, or who might even be conscious just like humans and programmed to enjoy making you happy
- create a simulation which everyone migrates into, while the AI stays outside and keeps everything running. In there you might be able to simulate anything and any experience.
- rebuild the human brain from the ground up. Evolution did not select for the most user-friendly hardware. There might be ways to create new emotions and experiences that are far superior to our current ones.
The list goes on. And although some of these seem rather far-fetched, if ASI really does happen soon-ish and if it really is successfully aligned with human interests, I think it's likely one or more of these utopian scenarios become reality.
 
Yeah, I was low-key trying to pick your brain a little, since I think about this stuff a lot too, like most people have since this year started with ChatGPT and all that. You do a great job articulating it for people to read and understand, though. The singularity is hopefuel for me as well.
 
Thanks mate.

It's not only the good ending that can help your mental state. If you honestly believe that we might all be killed in possibly as little as a few years, or maybe a few decades, that can push you to take bigger risks and do things you always wanted to do but never had the guts to try.

This might be throwing yourself back into the dating market and just grinding away until you somehow find someone. The threat of approaching doom might help you by making the humiliations and rejections seem less important.
Or it might help you overcome your inhibitions and start escortmaxxing.
Or maybe it just helps you to make peace with your parents or w/e.

But basically, anything that is not an S-risk scenario is either good or neutral.
 
I thought about doing this to be preapared but nothing is to easy
 
Yeah, and because the ASI is gonna be extremly smart, if it wants to capture you alive, it will not be making it obvious what is happening. You likely will not get some clear cut warning signals. Without those, I think it will be extremly hard to end yourself. Imagine blowing your brains out because "you kind of had a weird feeling about some of things that were going on in the world". I don't think I could do it, no matter the risk.

Preparing a reliable and quick suicide method is still a good idea I believe, even if it's just to give you some peace of mind. And who knows, maybe for some reason you get a clear warning and then you can escape a life of torture by smartly planning ahead and being prepared.
 
