My cliff notes from You don’t get to know what you’re fighting for:
Knowing what specifically you are working towards is hard.
It is easy to specify what you want negatively — which outcomes you don't want — but that only tells you what you don't want, not what you do want. It's much harder to state positively what you want.
It's also often the case that our goals shift while we pursue them. For example, “help the poor” can turn into “get rich to donate a lot”.
With our current understanding of philosophy, it is highly likely, if not inevitable, that even simple altruistic goals will shift as we progress toward them or as we investigate the philosophies that underpin them.
[This shakiness of the philosophies Nate points to when questioning several example goals is why I’ve limited the time I invest in philosophical understanding. It has always seemed that the further down philosophical trails I go, the less it helps with anything.]
Even when you think you know what you’re fighting for, there’s no guarantee you are right, and since there’s no clearly established objective morality, there’s likely to be an argument against your stance.
There is no objective morality writ on a tablet between the galaxies. There are no objective facts about what “actually matters.” But that’s because “mattering” isn’t a property of the universe. It’s a property of a person.
There are facts about what we care about, but they aren’t facts about the stars. They are facts about us.
It is also possible that our brains are lying to us about our intentions. We don’t have enough introspective capability to truly understand our motivations, which are often grounded in the hidden and arbitrary rules of natural selection.
My values were built by dumb processes smashing time and a savannah into a bunch of monkey generations, and I don’t entirely approve of all of the result, but the result is also where my approver comes from. My appreciation of beauty, my sense of wonder, and my capacity to love, all came from this process.
I’m not saying my values are dumb; I’m saying you shouldn’t expect them to be simple.
We’re a thousand shards of desire forged of coincidence and circumstance and death and time. It would be really surprising if there were some short, simple description of our values. […]
Don’t get me wrong, our values are not inscruitable [sic]. They are not inherently unknowable. If we survive long enough, it seems likely that we’ll eventually map them out.
We don’t need to know exactly what motivates us or what we want; “The world’s in bad enough shape that you don’t need to.” We can have something to fight for without knowing exactly what it is. We have a good enough idea of which direction to go. Relying on what we are pretty sure of is good enough to decide what to do next.