DARPA to launch efforts that will bolster defenses against manipulated media (darpa.mil)
65 points by geox on March 16, 2024 | 115 comments


A good place to start might be in mandating some level of media literacy in the national curriculum.

Media Studies tends to be a bit of a meme but it’s becoming increasingly important.


Agreed. This is a social problem and no amount of technical solutioneering will solve it. A significant chunk of the US believes Obama is a Muslim and/or Kenyan on the weight of approximately no evidence. Large numbers of people treat images of people with glossy skin and mangled hands on Facebook and Instagram as photographic fact. VFX videos that would be lambasted as amateurish and unconvincing in a film are treated as undeniable evidence by many commenters on YouTube and TikTok. These are not highly sophisticated lies, and yet many people accept them uncritically because they haven’t learned any better.


They have in fact learned the opposite. We've spent decades carefully separating ourselves from even the most obvious critical thinking skills that a child would employ. Long before sophisticated disinformation tools, people readily slurped up the most obvious lies.

I'm really at a loss.


> yet many people accept them uncritically because they haven’t learned any better.

The political establishment doesn't necessarily want critical thinking - they just want you to believe their version of lies. People uncritically accept anything negative about Trump all the time and people who own media channels don't want that to change.


Right on. Also, having the same high-quality US civics classes I had as a kid would help young people develop scepticism and learn the value of reading good books, often classic books, instead of spending hours on social media.

While I think the Patriot Act was a horror show for the health of our country, and I am suspicious of the very broad reach of the proposed TikTok ban legislation, the military system also does a lot of good work that benefits our society in non-military ways.


Just mandate all media to be in latin, and maintain a class of experts versed in the language to interpret it to the masses.


The issue is that it will be seen as "political" if one political side's media happens to be cited more often in examples of misinformation, and one will.


I don't think there'd be an issue in finding sufficient material for any political side.


Selecting from a biased population into an “unbiased” sample is itself also political. Like having one fringe scientist on stage with one mainstream scientist, giving the illusion that their ideas have equal acceptance among their peers.


Sometimes that fringe scientist turns out to be right. Ideas should be judged on their own merit, not based on the resistance of those who've built their careers on the currently accepted ideas.


That’s a nice maxim and all but in actual reality, the vast majority of people on the vast majority of topics have to use heuristics to build (and disconfirm) their beliefs. The people who are actually close enough to a topic to even be able to judge an idea “on the merits” are few and far between. If you want people to have an accurate model of reality, you have to care about maintaining heuristics that at least somewhat track with the consensus (or the lack thereof) of the underlying group of people who actually have a chance of knowing things from first principles and direct experience.

Deliberately sampling an “unbiased” subset of an actually biased underlying population is not just allowing the heuristic to stray from reality, it’s deliberately destroying that heuristic and deliberately misleading people. In imaginary “everyone can/should judge on own merits”-land, that’s perfectly fine. In reality, where almost no one can actually judge almost anything of substance on its merits, you are actively sabotaging truth-seeking.

The way that fringe scientist gets to declare s/he is right is by convincing their peers of it, not by convincing the public in a format that violates the heuristics the public has to rely on. Of course everyone should always be open to this possibility, most of all their peers (and to your point they often aren’t!)


I wish more people understood this - that most (all?) people are missing the required years of knowledge and understanding of nuance to be able to judge an idea "on its merits", particularly scientific/technological ideas. Trusting expert consensus - while certainly not perfect - is the only logical (and relatively safe) way forward.

What's of even more concern to me is the prevalence of this in the HN crowd - ostensibly a bunch of very intelligent people with expert knowledge in their respective fields and who thus understand what goes into actually understanding something...


The Gell-Mann amnesia effect is a powerful drug.


Could you please elaborate on what you mean by this?


Scientists get to declare they are right by finding evidence that their theory works. Convincing their peers makes that easier (they'll have more resources and help in the search for evidence), but ultimately the evidence should be clear to everyone.

Do germs grow in a heated, sealed flask? No. Clear evidence for germ theory, disproving the accepted view at the time, spontaneous generation.

Do nuclear fission and modern electronics work? Clear evidence for quantum physics, even though Einstein argued against it till the end.

Is GPS accurate before or after adjusting for time passing faster in orbit? (After.) Clear evidence for general relativity, despite how wild it seems.
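For what it's worth, that GPS claim checks out with a quick back-of-the-envelope calculation. The sketch below uses standard textbook values (Earth's GM and radius, GPS orbital radius and speed), not figures from the thread:

```latex
% Assumed textbook values: GM_E ~ 3.986e14 m^3/s^2, R_E ~ 6.37e6 m,
% GPS orbital radius ~ 2.66e7 m, orbital speed ~ 3.9 km/s.
\[
\frac{\Delta t_{\mathrm{grav}}}{t}
  \approx \frac{GM_E}{c^{2}}\left(\frac{1}{R_E}-\frac{1}{r_{\mathrm{GPS}}}\right)
  \approx 5.3\times10^{-10}
  \quad(\approx +46\ \mu\mathrm{s/day}),
\]
\[
\frac{\Delta t_{\mathrm{vel}}}{t}
  \approx -\frac{v^{2}}{2c^{2}}
  \approx -8.3\times10^{-11}
  \quad(\approx -7\ \mu\mathrm{s/day}),
\qquad
\text{net} \approx +38\ \mu\mathrm{s/day}.
\]
```

Without that roughly 38 microseconds-per-day correction, position errors would accumulate on the order of 10 km per day.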

On the other hand, when a group of people collectively tell you their ideas are right, and that other ideas are "fringe", without clear evidence, that's an ideology or a religion.


To quote Max Planck: "a new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it." Groupthink holds back science.


Now that you mention it, I wasn't even thinking this through in my original comment.

Even if you found a perfect balance, I can imagine little Bobby coming home from school to see his parents invested in their favorite TV news program and saying: "Mr. Forrester told us that's a type of misinformation, mom."

I would not want to be Mr. Forrester in that scenario. Just imagine the school board meetings.


Media (and corresponding manipulation) could be created specifically for teaching purposes, aiming to keep the material as apolitical as possible while still retaining the techniques of manipulation.


That actually sounds like a great idea.


That is a solvable issue -- give examples of manipulation and misinformation for both sides in roughly equal quantities. There is no shortage of examples for any argument.

Nothing helps make people manipulation-resistant like seeing manipulation from the side of the argument they support.


That still doesn't solve the issue: manipulation and misinformation are entirely personal and highly emotional.

See the current messaging on inflation coming out of the White House, or the situation between Palestine and Israel.

You are always being manipulated and lied to by both sides because it is in their best interests to have you aligned.


This is exactly what I am suggesting to address.

We currently have separate narratives of "<side 1> engages in manipulation, here is their misinformation" which polarizes and causes highly emotional reactions against the evil supporters of that side. This also means that people, due to that emotional response, are unwilling to consider opposite views and we get the polarized society we have today.

If instead we teach with examples of manipulation from both sides of the same argument, people see the value of debating on the merits instead of emotions, cross-checking references, and correcting for the speaker's viewpoint. They also become less affected by misinformation, since they have seen it many times, including coming from the side near and dear to their hearts.

This last point is critical. People do not get too riled up about messaging once they have seen both sides doing it. My 2c.


I was taught that multiple times under sections on political cartoons/propaganda, at least when I went to school.

The issue is not that people are not well educated; it's that there is not enough reason for individuals at large scale to invest time deep-diving into a topic when they can use other things, like social cues and popularity, to more or less decide where they fit on the spectrum.

The issue is not that manipulation is invisible to the untrained eye; on the contrary, I'd argue a lot more people are aware of political manipulation than you realize. It doesn't take much logic to see that politicians only start caring about you once elections come around, regardless of where you fall on the political slider.

People are optimizing for social harmony and the security of their livelihood. This means that, more or less, whatever the opinions of those in my community / work / place of worship are, I will reciprocate those signals to some degree or another unless there is an issue that resonates strongly enough. Add mass-scale reach via social media and the law of large numbers across the human population, and I hope you can see why there is no non-technological solution to this problem.

We have created a dopamine machine in our pocket capable of satisfying the natural urges of every permutation of the mind. Your bigger picture and moral appeal is given a moment of consideration by an ever-shrinking percentage.


There are more than two sides.


How useful is it to teach media literacy when nobody applies it? I'd argue it's the opposite of useful: harmful. People who have learned media literacy feel more resilient against disinformation, but they likely don't care enough about any single news piece to actually apply the frameworks and verify whether it is correct. So, quite ironically, the fact that they feel more resilient against disinformation makes them more susceptible to it than the average person with no media-literacy education at all.

Figure out how to teach media literacy correctly before spreading the current, rather flawed media literacy teachings, so we don't end up with more wannabe media literate people who are actually more susceptible to disinformation than the average person.


> How useful is it to teach media literacy when nobody applies it?

How can people apply something they've not been taught?


Throw in some basic statistics too? Many of the graphs cookers throw into their Instagram posts are from real studies but have been manipulated to push misinformation.
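As a concrete (and entirely invented) illustration of the kind of manipulation meant here, a short sketch of the most common trick: truncating the y-axis so a nearly flat series looks like a dramatic swing. The data values below are made up for the example.

```python
# Toy illustration with invented numbers: the same nearly-flat series plotted
# twice, once with a full y-axis and once with a truncated one.
import matplotlib.pyplot as plt

years = [2019, 2020, 2021, 2022, 2023]
rate = [2.1, 2.3, 2.2, 2.4, 2.5]  # invented values, nearly flat

fig, (honest, cropped) = plt.subplots(1, 2, figsize=(8, 3))
honest.plot(years, rate, marker="o")
honest.set_ylim(0, 5)                  # full axis: the change looks modest
honest.set_title("Full y-axis")
cropped.plot(years, rate, marker="o")
cropped.set_ylim(2.0, 2.6)             # truncated axis: same data looks dramatic
cropped.set_title("Truncated y-axis")
plt.tight_layout()
plt.show()
```

Both panels plot identical data; only the axis limits differ.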


"Media literacy". Aka, "How to interpret everything through the horse blinders and funhouse lens of the dominant neoliberal ideology of the US ruling class."


I'm British and thinking about this in that context, not an American one, for what it's worth.


> AI Forensics Open Research Challenge Evaluation (AI FORCE)

You expect good acronyms on a .mil website, but even by that standard this one is worthy of praise.


As an adult can I opt out of this and decide for myself what media to consume?


I take it you didn't read the article. The "media" in "manipulated media", in this case, means "a picture" or "a video", not "The Media" (as I also initially thought).

This is a set of tech meant to ID deepfakes, not to restrict access or to disseminate messaging generally. It's not a bad thing to exist, if it can actually be done. I'd prefer to know if a picture is bunkum.


I would hope that this is made open source, though. I don't really trust any government and/or megacorp contractor to tell me honestly whether something is fake or real.


The end goal is "The Media" manipulation. The idea being: you automate the identification of disinformation and automate alerting social media websites about it. The websites themselves will of course be incentivized to automatically remove disinformation based on the alerts.

Practically speaking, it will allow mass manipulation of public opinion for pretty much any topic by identifying opposing opinions as disinformation and automating the process nationwide or worldwide.


What is the opting in or mandatory part of this?


If people's needs aren't being met and they decide to follow one party or ideology, that's just the way the cookie crumbles.

This is democracy. To forcefully change people's minds, regardless of what party or beliefs they have, is authoritarian.

This desire to bolster truth will be political. It comes from the same place as the idea that we should censor certain discourse or ban certain books.


Weirdly enough, according to the fine print of the internet: no, you can't.


The Ministry of Truth.


Does that include checking against the intelligence community for touting false information? Askin' for a friend.


Time for an anti-memetic division.


SLR photo chip -> Cryptographically Signed Image -> Blockchain or Other Trust System Documents Production of Image, and with a key system that matches the image to the record, all alterations to the image are documented on said blockchain/other trust system. A verifiable system for chain of custody.
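A minimal sketch of what that pipeline could look like, assuming a per-sensor signing key and a simple hash-chained log standing in for the blockchain/trust system; the names and structure here are illustrative, not any real camera or registry API:

```python
# Sketch of the proposed chain of custody: sign the raw capture with a key
# held by the sensor, then record every subsequent edit in a hash-chained log.
# Uses the third-party "cryptography" package; the in-memory log stands in for
# whatever blockchain or trust system would actually hold the records.
import hashlib
import json
import time
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

camera_key = Ed25519PrivateKey.generate()  # stand-in for a key fused into the sensor

def capture(raw_bytes: bytes) -> dict:
    """Sign the raw image at capture time and start its provenance record."""
    entry = {
        "event": "capture",
        "sha256": hashlib.sha256(raw_bytes).hexdigest(),
        "signature": camera_key.sign(raw_bytes).hex(),
        "ts": time.time(),
    }
    return {"log": [entry]}

def record_edit(record: dict, edited_bytes: bytes, tool: str) -> None:
    """Append an edit entry that chains back to the previous log entry's hash."""
    prev_hash = hashlib.sha256(
        json.dumps(record["log"][-1], sort_keys=True).encode()
    ).hexdigest()
    record["log"].append({
        "event": f"edit:{tool}",
        "prev": prev_hash,
        "sha256": hashlib.sha256(edited_bytes).hexdigest(),
        "ts": time.time(),
    })
```

Anyone holding the sensor's public key could then check that the first entry's signature matches the original raw bytes and that every later entry chains back to it.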


> SLR photo chip -> Cryptographically Signed Image -> Blockchain or Other Trust System Documents Production of Image, and with a key system that matches the image to the record, all alterations to the image are documented on said blockchain/other trust system. A verifiable system for chain of custody.

You can have a fully verified factual photo that is still a lie.

The early pandemic had some great examples of this. Journalists wanted to show how crowded cities like NYC were despite distancing rules, so they took telephoto shots of people standing in line facing the camera. This compresses the perspective and makes them look bunched up. The same scene shot orthogonally to the line shows people standing 6 ft apart.

Here’s an article that explains this in more depth: https://mainichi.jp/english/articles/20210126/p2a/00m/0op/00...

You don't even have to cheat that obviously. A photo can lie just by showing, say, a bunch of soldiers holding guns next to scared children. Oh no, look at those scary foreign soldiers terrorizing the populace! What you don't see is the humanitarian aid truck behind them and the workers distributing food to those children. Turns out the soldiers were there to protect the aid process, not to terrorize children.

This is not a technology problem.


> Early pandemic had some great examples of this.

The American civil war has some great examples too.


Take the SLR camera, hook it up to a computer with a custom driver, and every time you generate a new picture with AI, invoke the camera's signing process programmatically. Most likely you don't even need the camera once an exploit is found for the hardware.


I don’t think you even need that level of sophistication. You can probably get “good enough” results just pointing the camera at a high-res screen displaying the manipulated image and putting it slightly out of focus.


Possibly a new use case for light field cameras, which are able to capture depth information from their sensors.


TPM is that broken?


Cut the sensor out and wire it to USB.

Or get a premade Chinese device that will do it for you, just like they happily broke HDMI encryption to make splitters work.


My one claim to fame got this done years ago. [1]

[1] https://patents.google.com/patent/US6757828B1/en


I love the idea of universal traceable media and data. I'm sure that will in no way be a bad thing.


Precisely zero photos you see in media are unedited/uncropped "straight out of the camera" (as we say). Professionals don't even shoot in formats that web browsers can display. I'm certainly not posting my 160MB raw files to the web.

What you propose is thoroughly a non-solution (to a non-problem).


The idea is that the raw photo files are cryptographically signed. Of course they're going to be edited, retouched, etc. But if a media outlet is challenged on the veracity of the photo - as in, claiming it's totally fabricated, not just touched up - then the signed raw file adds credibility.
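A hedged sketch of the verification step being described, assuming the outlet can produce the signed raw and the camera maker's public key is known (the function and parameter names are illustrative, not any real API):

```python
# Verify that a disputed raw file really carries a valid signature from the
# claimed camera key. Uses the third-party "cryptography" package.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PublicKey

def raw_is_authentic(raw_bytes: bytes, signature: bytes, public_key_bytes: bytes) -> bool:
    """Return True if raw_bytes was signed by the key behind public_key_bytes."""
    public_key = Ed25519PublicKey.from_public_bytes(public_key_bytes)
    try:
        public_key.verify(signature, raw_bytes)
        return True
    except InvalidSignature:
        return False
```

Of course, as the sibling thread points out, this only establishes that the raw frame came off that sensor; it says nothing about whether the framing of the shot itself is honest.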


Damn, I always thought that this was Ministry of Truth's job ...


I can't wait for an official government arbiter (or better yet a "partner" not bound by the constitution or FOIA disclosure laws) deciding what is and what is not foreign manipulated media. Hopefully they get this going before the election before Americans start electing the wrong candidates.

Rest assured it's for our safety. Those pesky Russians are always trying to destroy us!


Isn’t it obvious that Russia and China have been waging an information war against the U.S.? Isn’t it in the national and societal interests to prevent a foreign power from sowing dissent amongst the citizens of the country?

It’s been estimated that around 50% of the comments on the internet are bots. It’s cheap and easy to target people with messaging that influences their behavior. It’s worthwhile to try to prevent this.


It isn't russia/china fuckery OR usa autofuckery. It's russia/china fuckery AND usa autofuckery.

And to further fuzz the culprits, the usa/china/russia people invest in each other's "countries".


It’s ok to try to stop a bad act from happening even though other bad acts are also occurring.


Attempting to fix the cracks in society itself (including the corruption and shocking lack of honesty from many western government officials) is likely to be more fruitful than attempting to stop adversaries from pointing the cracks out.

If people feel represented and informed and wisely governed and have a stake in the system, adversarial information wars will be much less effective.


It’s hard to have an “informed” society while simultaneously not trying to fix the disinformation problem.


You can't fix adversarial disinformation with domestic disinformation.

Trust in western governments is not great at the moment. The primary cause of this is probably not adversarial disinformation.


You should read the article. No reasonable person is proposing a domestic disinformation campaign. The proposal is for an information campaign: to identify fakes and let people know at the time they first encounter whatever it is that was faked.


The Establishment: We have to lie to them for the good of society!

Also the Establishment: We also have to stop disinformation!


Not all information is disinformation.


> It's been estimated that around 50% of the comments on the internet are bots

That's just the sort of false misinformation being propagated through the internet and social media these days that makes it difficult for us as a society to properly assess and deal with our problems.

There is no way that only 50% of comments on the internet are sock puppets of one kind or another.


It is way worse and that's not going to get better: AI trolls.

Strap a neural net to a headless Blink/WebKit with a virtual mouse and keyboard.

Click farms would be used to generate the data for machine learning.


> There is no way that only 50% of comments on the internet are sock puppets of one kind or another.

No way? Why not? First, the content doesn't have to be original, just a variation of some provided template; then, it doesn't have to be unique. Now combine this with automated tools and voila: even a single person can produce a lot of content. Multiply that by multiple accounts re-sharing it and you can increase it by an order of magnitude.


Pretty sure he was being sarcastic. Which he should be.


[flagged]


@dang we o.k. w/ ad hominem at HN now?


This isn’t ad hominem. I read every single thing they ever posted to HN (in context) before making that, quite frankly generous, assertion.


I'm flattered! Although I will respectfully agree to disagree with your summary judgement against me.


They can be correct regardless of their motivation.


I... actually can't wait? You're not wrong that it can be overused but there is risk in both directions and pretending like there isn't just fully exposes you to one of them.


Yeah I've got to say I'm not sold on OP's whole "let's embrace chaos!" approach.


The USA has 248 years of legal jurisprudence regarding freedom of the press. I would personally not categorize it as chaos.


Of course but the point stands. There are opposing forces here and it would be foolish to care about one and ignore the other.


Obviously foolish. One of the downsides to this format is that you're exposed to opinions nobody would bother to maintain face to face.


It's pretty amazing that $100k can decide an election far more effectively than $1B can.

The simple conclusion is that traditional campaigns are money-laundering and money-wasting operations, but if you really want to win, just pay the Russians and Chinese a rounding error of an amount and they'll change the outcome in your favor!


The arguments the US is using are literally the exact arguments Iran uses to justify its censorship.

It's fascinating to see Americans cheer this on.


The same argument can be valid in one context and invalid in another when the premises differ.


Arguably the Iran argument is truthful. The US argument doesn’t really hold water (I haven’t seen any evidence of China actually manipulating the media shown on TikTok)


China also makes a good BBEG.


Especially since it genuinely is. I’m not claiming the U.S. is better in an objective sense but clearly the two countries are in opposition to each other. China clearly would love a divided U.S.


The US loves a divided US, too


Consider the possibility that this is due to influence by foreign-controlled bots and targeted information warfare against American citizens.


It could be both a) foreign influence and b) the media and politicians benefiting from polarisation.


It is indeed. Yet a foreign government can't pass laws curtailing the freedom and invading the privacy of a citizen on their own country's soil.


It is in the citizens' interest that the do-gooders be divided so they get nothing done.


I tend to mistrust the motives of a government warning me to fear the nebulous influence of commie spies lurking around every corner, and telling me I should accept government sanction and censorship to protect me from their sinister mind-control algorithms, more than I mistrust the red menace itself.

Dissent is good. It's a vital component in any free society. The first two amendments of the US Constitution exist to preserve the right of the people to dissent, to the point of violence. Theoretically. Legally you can't just shoot a politician in the head, however cool Thomas Jefferson might have been with it. Point being, the US is supposed to be a society in which instability and dissent are always present, threatening but hopefully never toppling the edifice.

Of course that requires an educated and informed populace capable of discerning between critical thinking and cynicism, likely truth and horseshit, and a free press that isn't wholly subservient to corporate and government interests, and a lot of civic and cultural traits the US lacks. But that's what we should be focusing on, and that remains the case regardless of whether one is talking about Chinese propaganda or American. It's all the same. Over half of the anti-vaxx conspiracy content spread across social media during COVID came from only twelve people[0], one of whom was a Kennedy.

Mind you, I'm pro-deplatforming, as long as it's the platform's decision. Platforms should absolutely be free to reject whatever they like on whatever terms they wish; freedom of speech doesn't compel anyone to spread that speech against their will. But when the government steps in and says it's on my side, I assume that eventually they'll just want to send me to the camps.

[0]https://www.npr.org/2021/05/13/996570855/disinformation-doze...


> and a lot of civic and cultural traits the US lacks

Indeed, that feels like the root of the matter. If men were angels we wouldn't need laws. Real people are messy, and bumper-sticker principles don't adequately characterize us well enough to design a set of pragmatic laws.

A real compromise leaves everyone mad, and given how mad we are already I'm at a loss to suggest how to proceed. I'll be honest that I'm pretty angry myself; I think we've allowed ourselves to be taken in by some pathetically obvious propaganda. I wish we were at least talking about subtle and interesting problems. Instead we seem to be gleefully re-litigating the obvious.


As with most things in life moderation is essential. Dissent is good if it isn’t too extreme. Exceptions abound. Hence the intractability of the current threat to the U.S. Disinformation will win out. Look at how many people stupidly don’t vaccinate their kids against polio now. The evidence is so overwhelming that the vaccine is beneficial and good for society that no reasonable person can possibly be opposed to it being mandated. If something as obviously beneficial as the polio vaccine can be made into a divisive issue then there is no hope.


BBEG: Big Bad Evil Guy/Gal


The original title is "Deepfake Defense Tech Ready for Commercialization, Transition" but has been changed to "DARPA to launch efforts that will bolster defenses against manipulated media". The changed title is unnecessarily unsettling and misleading. I don't want to see the title changed to a clickbait title like a lowbrow gossip magazine would give it.


> I don't want to see the title changed to a clickbait title like a lowbrow gossip magazine would give it.

The title is taken from the subheading, which might be unnecessary in this case but is usually fine. The poster hasn't made up some clickbait title.


What I mean is that it is malicious to replace the title with what would have been the subheading, because the result looks like a clickbait title. Being a subheading does not mean anything can be turned into the title. The fact is that this thread is now occupied by bikeshedding that has nothing to do with the technology.


It's not 'malicious'; it's relatively standard HN title practice. If you think the submitter got it wrong, just email hn@ycombinator.com and they'll take a look at it.

https://hn.algolia.com/?dateRange=all&page=0&prefix=false&qu...


That's not what I mean. Well, it probably was not translated well.


Seems like a cover for publishing false evidence of crimes that will be used to justify military invasions.


Why would they need to publicly announce a new program to continue doing that?


To get more public buy in and quiet dissent. The current climate is somewhat hostile to even supporting war much less directly engaging in it.


They didn't need any such cover for the Iraq war.


They may not have needed it, but it happened nonetheless.

https://en.wikipedia.org/wiki/Nayirah_testimony

There are recent parallels, but this time by Biden, who was fed false information. Though this time, the White House retracted the statement immediately instead of waiting for things to blow over [1].

I suppose the difference was that this time, the US government had nothing planned to capitalize on the lie.

[1] https://nypost.com/2023/10/11/biden-ive-seen-pictures-of-ter...



I don't quite see how that is relevant, as it is something that really happened, albeit with disputed cause.

The two examples above are fabricated and, going back to the point of the comment chain, were caused by false evidence (testimonies).


I disagree. The Nayirah testimony may have been exaggerated, but that doesn't mean the Iraqi army didn't commit a wide range of atrocities in Kuwait.


That sounds similar to Lantos' statement

>"given the countless cases of verified Iraqi [here: Israeli, the author] human rights violations", it was "unnecessary and counterproductive to invent atrocities."

In that sense I do see the connection between that made-up event and the bombed hospital: it doesn't really matter whether this specific case happened, since there are enough true ones to go around.

That is to only say that I now get your point, not that I agree to what I would call muddying the waters.


Is there a difference?


Yes. After the Smith-Mundt Act was amended under Obama, they can now feed propaganda intended for a foreign audience directly to your smartphone.


[flagged]


Murthy v. Missouri is being heard before SCOTUS this week


Such an American point of view that lies, deception and falsehood have primacy over honest communications and information. People keep discussing theoretical results that truth will win out while ignoring the vast amount of actual data showing precisely the opposite. I am tired of the praises of "free speech" when all we see is conspiracies, propaganda, cons, scams, ads, and billionaires censoring those they don't like and amplifying those they do like.

Sure, I am aware of the real concerns about government censorship, but the modern internet and media have basically become a vast refuse heap. Is this really the desired end goal? Is there a better path forward? The path we're on has ceded total control of information to corporations and billionaires, whom I have no control over. At least with government I have a vote to influence its direction.


> I am tired of the praises of "free speech" when all we see is conspiracies, propaganda, cons, scams, ads, and billionaires censoring those they don't like and amplifying those they do like

So you'd prefer something like the Chinese internet, where everything is rigorously fact-checked by the government and anything that reflects negatively on the government is censored? Because empirically that's what happens 100% of the time when you give the government the ability to censor media: it censors media that reflects negatively on it.


I think this is a false dichotomy. There are more options than two. We don’t have to choose between the free-for-all of “conspiracies, propaganda, cons, scams, ads” and Chinese State Censorship. There are many middle grounds.


At least DARPA has some success stories in terms of delivering useful technology.

Still, I feel what you are saying. If I am hesitant, it is because I think I saw actual AI video in the wild yesterday and it was convincing. And it will get worse.

What does surprise me is that old media don't lean into it more, because it would reinforce their status as gatekeepers IF they play it right.


What makes you think they haven't been leaning into it for years? lol, most news outlets don't even have reporters any more; they just buy content.

You can catch media using photos from other events all the time for news articles.


Apologies. I may have not expressed my thoughts clearly.

What I meant was that if legacy media managed to position itself as an 'organic only' source, it could benefit from renewed trust as people try to seek out 'non-AI' content. Now, that would require humans (and money).

The natural question is whether, as a species, we would rather fool ourselves with whatever we want generated in our searches, or read a verified account.

<< What makes you think they haven’t been leaning into it for years?

To answer the original question: I think that because, if that is true, they have failed hard.


Remember when Facebook censored true stories about COVID vaccine injuries because they were "malinformation", at the request of federal agencies?

This will enhance that.



