Ok, so this has not happened yet, where somebody hacks your pacemaker, connects to your phone, and says:
Pay up or your heart will flutter.
But according to a Threatpost story, we are almost there (my interpretation):
“Pacemaker Ecosystem Fails Its Cybersecurity Checkup”
There have not been any cases of ransomware or other cybersecurity incidents on pacemakers, but the report suggests it would be good if some authentication (any at all) were built into the devices, as no one knows what kind of shenanigans criminal hijackers could create.
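To make "some authentication (any)" concrete, here is a minimal sketch of a challenge-response handshake using a shared key. Everything here is illustrative and assumed, not from the report: the key name, the provisioning story, and the function names are all hypothetical, and a real implant would need far more (key management, replay protection, power budgeting).

```python
import hashlib
import hmac
import secrets

# Hypothetical: a device-specific secret provisioned at manufacture and
# shared only with the clinic's authorized programmer unit.
SHARED_KEY = b"device-specific-secret"

def make_challenge() -> bytes:
    """Device issues a fresh random challenge to anyone trying to connect."""
    return secrets.token_bytes(16)

def sign_challenge(key: bytes, challenge: bytes) -> bytes:
    """The programmer unit proves it holds the key by signing the challenge."""
    return hmac.new(key, challenge, hashlib.sha256).digest()

def verify(key: bytes, challenge: bytes, response: bytes) -> bool:
    """Device accepts commands only when the signed response matches."""
    return hmac.compare_digest(sign_challenge(key, challenge), response)

# A legitimate programmer with the key passes; a hijacker without it fails.
challenge = make_challenge()
print(verify(SHARED_KEY, challenge, sign_challenge(SHARED_KEY, challenge)))  # True
print(verify(SHARED_KEY, challenge, sign_challenge(b"attacker", challenge)))  # False
```

Even something this simple would mean an attacker cannot issue commands just by being in radio range, which is the gap the report is pointing at.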
And mark this point in time: criminal hackers will create shenanigans. There will be methods yet unknown, think DoS, or Denial of Service.
Sure, you may not need the pacemaker all the time, but you need it at certain times. What if it does not operate as it should? Whose fault is it? The hacker's, the doctor's, or the pacemaker manufacturer's?
I found this email very interesting:
“They need to make sure projects meet requirements should it touch any government data
- #1 priority is a technical person, they can teach security guidelines”
The email is from a recruiter looking for a certain type of security analyst who will look over shoulders, review code, and help programmers and others code with a security mindset.
Here you can see the germination of a security agenda at this entity.
This is a good thing, but this position should be just one part; the other part is a security testing regime, which I did not see mentioned in the qualifications.
The overall cybersecurity problem is complexity, and thus you need an unfettered testing department checking on the programming department, even one where cybersecurity is important and built in.
Just because you write good code with cybersecurity in mind, does that mean it is secure?
One must still test the code to further reduce the risk of security problems.