LessWrongers refers to the LessWrong community, which is generally interested in AI alignment. There have been quite a few interesting allegations about it, but I will let you look into that on your own without my opinion.
LW and rationalists have a lot of overlap; LW might even be a proper subset of the rationalists.
The “rationalist” community has, as the name implies, reasoning as its basis: basically trying to use reason, with some degree of rigour, to generate and test knowledge.
There is also another movement, or loosely coupled collective of people, with great overlap with rationalists: those who refer to themselves as “longtermists”. Loosely speaking, longtermists claim to think about very long-term issues that concern humanity rather than immediate ones. Think hundreds of years down the road.
I think acc may have been intended to be ACX / Astral Codex Ten (X = ten), which is a community around Scott Alexander, a prolific author, LW member, and well-known rationalist.
I have attempted to keep this comment devoid of my opinion of said people. Hopefully it wasn’t too handwavey.
e/acc, l/acc, r/acc, u/acc, c/acc, etc., and completely mask-off adjacent ones like kali/acc
calling longtermists long-term thinkers is stopping a bit short. specifically, they believe that long-term considerations must include billions/trillions of future unborn AI minds that require as much humanity and care, even starting now, in proportion to the current/future smaller numbers of physical beings. a lot of wacky thought comes out of these foundations
>specifically they believe that long term considerations must include billions/trillions of future unborn AI minds that require as much humanity and care, even starting now, in proportion to current/future smaller numbers of physical beings. a lot of wacky thought comes out of these foundations
This is a clear strawman and not representative of what actual longtermists believe.