

A few years ago, blatant journalistic malpractice was a controversy.


Are people so lazy they can’t even bother to read the headline? Maybe an AI would’ve been useful here to generate its own defense.


Nah, they’re made aware.
I mean, I guess a state could have passed the law saying “hey, leave newborns at fire stations” without informing the firehouses, but it seems far more likely that they are informed.
Besides, states either have designated boxes, or you hand the child over to someone directly. You don’t just leave a baby in front of the firehouse door.


So, in Quebec, according to this article, they passed a law requiring facilities to let it happen on-site.
That’s all that needs to happen: the hospital already has any specific equipment on hand, and just needs to be willing to let in a doctor who is OK with it.
I agree with you that no individual person should be forced to kill someone, but a hospital isn’t a person and doesn’t have feelings. There’s a very reasonable chance someone works there who would have been OK with providing MAID but doesn’t offer it, and even if 100% of the doctors there weren’t OK with it, it’s a lot simpler to have one doctor travel than it is to arrange a whole new bed, ambulance, on-site doctor, and family.
To me at least, that IS negligence. It’s not a violation of any individual’s beliefs that MAID happens in their general vicinity, and it’s just not true that requiring a facility to allow it results in requiring individuals to perform it.
Also, less relevant: it’s not necessarily that the vehicle can’t keep the patient alive, it’s that the patient might pass at any time, and that time might be during transport in an ambulance that is designed for emergencies first and doesn’t accommodate families.


The answer is yes, this is exactly what sites like fiverr are for.
That is, if you value your time more than your money here, because there’s probably a way to semi-automate it and avoid some of the work. But yes, fiverr.


I think humanizing them is a fairly trivial thing, in this sort of context.
Yes, it’s true, it didn’t “lie” about health.
But it has the same result as someone lying: it’s another bullet point on the list of reasons not to trust AI. Even if it pulls from the right sources and presents information generally correctly, it may simply not present information it could have presented, because the sources it learned from presented it in a way that would get those sources deemed “liars”.
You could write all that out every time, I suppose, but people will say their dog is trying to trick them when he goes to the bowl 5 minutes after dinner, or goes to their partner for the same, and everyone understands the dog isn’t actually attempting to deceive them; he just wants more.
Same thing, to me at least. It lied, but in a similar way to how my dog lies, not in the way a human can lie.


No, they’re definitely also expanding.
Not all of them, certainly, but there are a few plans for new factories. Samsung, for instance, is building a new chip factory, if you want something to search for.
The root means “slow”, BTW, so it does get to join that list.