Raise your hand if you’ve been on a diet before! [Insert raised hand black girl emoji]. Diets have been around forever, and they have done more harm than good to our society, especially to the younger generation. So what’s the truth about diets? Before we begin, what is a diet? A diet is …
Diet Culture: What Is It?
Diet Culture is anything that tells you that you should look a certain way to fit into society’s beauty standard. It encourages you to cut a certain group of foods out of your diet (what you eat) because it will supposedly make you fat; e.g., carbs. Diet Culture makes one believe that she/he should be thin in …