Does life get better after high school?
Life is immensely better after high school. People really do grow up. I've seen people I went to school with who were the most horrible bullies now having polite conversations with others, not mocking or teasing them. I spoke a few times with one girl, and it was like being in a different world. She never talked to me in school unless she needed something from me, but when I saw her after we got out, she was just a normal person. I think people become more of themselves once they're out of school. I never really thought about it much, even for myself, but I'm a lot happier now. I feel more like the person I was meant to be instead of who I was before.