It’s Still Legal To Eat Dogs In The United States – And Yes, It Happens