Life is not what we think it is. We're constantly inundated with information, advertisements, and media telling us who to be, how to act, and how to live. Maybe it's time we break out of the societal norms we're expected to uphold...
Since 2020, a lot has changed in the world. The question now is: with people shifting their focus to what life is really about, is this the end of Corporate America as we know it?