Should You Still Go to the Doctor If You Don’t Think You Have a Serious Injury After an Accident?
Car accidents are traumatic events. When thousands of pounds of metal and plastic slam together, the forces involved are far greater than the human body is built to withstand.
Even if your car has the latest crash-avoidance technology, injuries can still happen. If you can't see an injury and you don't feel pain, you may assume you're fine. However, symptoms often don't appear until a day or two after the accident, and hidden injuries such as whiplash or damage to internal organs can develop without you realizing it. That's why you should always see a doctor after a car accident. Read on to learn more.