Trump seeks to "paralyze" the Privacy and Civil Liberties Oversight Board (PCLOB), a key element of the EU-US data transfer deal ("TADPF") that allows EU-US data flows.
Short of aggregating it to get rid of the individual records completely, “anonymizing data” isn’t actually a thing.
That’s not the only way to do it. In quite a lot of situations you can instead generate artificial data that is statistically similar to the original data set and use that in its place. That works well for things like system testing, performance tuning and integration testing. Done right, you can even still pull out useful correlations without risking de-anonymising the data.
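A minimal sketch of that idea, assuming purely numeric columns (the column names and distributions below are made up for illustration): fit only aggregate statistics of the real data, then sample fresh rows from the fitted distribution, so no individual record survives but means and pairwise correlations carry over.

    # Sketch: synthetic data that is statistically similar to the original.
    # Assumes numeric columns; real data sets need more careful modelling
    # (categoricals, heavy tails, etc.).
    import numpy as np

    rng = np.random.default_rng(42)

    # Stand-in for the real data set (hypothetical: age, income, visits).
    real = np.column_stack([
        rng.normal(45, 12, 10_000),           # age
        rng.lognormal(10.5, 0.4, 10_000),     # income
        rng.poisson(3, 10_000).astype(float)  # visits
    ])

    # Fit only aggregate statistics -- no individual record is retained.
    mean = real.mean(axis=0)
    cov = np.cov(real, rowvar=False)

    # Draw synthetic rows from the fitted distribution.
    synthetic = rng.multivariate_normal(mean, cov, size=10_000)

    # Correlations carry over, so analyses on the synthetic data
    # approximate analyses on the original.
    print(np.corrcoef(real, rowvar=False).round(2))
    print(np.corrcoef(synthetic, rowvar=False).round(2))

This simple multivariate-normal fit only preserves second-order structure; more faithful generators (copulas, GANs) exist, but the privacy property is the same: the synthetic rows are drawn from a model, not copied from people.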
There are plenty of techniques to avoid re-identification… aggregation isn’t the only way. Especially considering that aggregation along a stupid dimension doesn’t help at all…
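A toy illustration of that last point, with a hypothetical four-row record set: generalizing the wrong column can leave the remaining quasi-identifiers unique, so every record is still re-identifiable, while aggregating along the right dimension actually buys anonymity.

    # Sketch: why aggregating along the wrong dimension doesn't help.
    # k-anonymity = size of the smallest group sharing the same
    # quasi-identifier values; k = 1 means someone is still unique.
    from collections import Counter

    records = [
        # (zip, age, diagnosis) -- hypothetical data
        ("12345", 34, "flu"),
        ("12345", 67, "cancer"),
        ("54321", 38, "flu"),
        ("54321", 61, "diabetes"),
    ]

    def k_anonymity(rows, quasi_identifiers):
        """Smallest group size over the chosen quasi-identifier columns."""
        groups = Counter(tuple(row[i] for i in quasi_identifiers) for row in rows)
        return min(groups.values())

    # Generalize age to a decade bucket but keep the full zip code:
    # every (zip, age-bucket) pair is still unique -> k = 1, nothing gained.
    bucketed = [(z, a // 10 * 10, d) for z, a, d in records]
    print(k_anonymity(bucketed, (0, 1)))   # 1

    # Drop zip entirely and group on the age bucket alone -> k = 2.
    print(k_anonymity(bucketed, (1,)))     # 2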