  • If the first generation of migrants doesn’t integrate properly, e.g. learn the language well enough to hold a job, it has cascading negative effects on the next generations. The second generation is more likely to grow up in a poor household where the host country’s language isn’t spoken, which hurts their school performance, and where the host country’s culture isn’t passed on, so these kids always feel like outsiders.

    This is basically what happened with the wave of migrants from North Africa and Turkey who were brought to Western Europe as labor after WWII. The first generation didn’t cause much trouble; they just did their job and went home every day, even though they didn’t integrate well, since everyone, including the migrants themselves, assumed they would return home after a few years. But it’s in the generations after that we see the negative effects of this failed integration: high school dropout, illiteracy, joblessness and poverty rates are much higher than in other groups, including other migrant groups. As a result, crime rates, including organized crime, are also disproportionately higher among the descendants of those migrants, and their sense of belonging is lower.

    We basically have to make sure the first generation understands what it takes for their kids to thrive in their new homeland: integrating, learning the language, understanding the culture. And the government must prevent enclaves from forming.

    I’m a 3rd-gen Asian migrant in Europe and went to school with many 2nd-gen migrants from Morocco, so I’ve seen this first-hand. Many of those kids were behind in school, mostly because they didn’t speak the language well, and even as adults they still don’t speak it properly. I only know a few of them who went to uni, and the parents of those few spoke the host language at a decent level.



  • It’s not a bug, just a negative side effect of the algorithm. This is what happens when the LLM doesn’t have enough data points to answer the prompt correctly.

    It can’t be programmed out like a bug; instead, a human needs to intervene and flag the answer as false, or the LLM needs more training data. The dozens of articles this guy wrote aren’t enough for the LLM to work out that he’s just a reporter. The LLM would need data that explicitly says he is a reporter who reported on those trials, and since no reporter starts their articles with “Hi, I’m John Smith the reporter, and today I’m reporting on…”, that data is missing. LLMs can’t draw that conclusion from the context.
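    As a toy sketch of that mechanism (a made-up corpus, not the actual case, and a deliberately crude model): a purely statistical learner only sees which words co-occur with a name, so if a byline only ever appears next to trial coverage, the name ends up associated with the trial itself rather than with reporting on it.

    ```python
    from collections import Counter

    # Hypothetical corpus: the reporter's byline always co-occurs with
    # trial coverage, but no sentence explicitly states his role.
    corpus = [
        "John Smith reports on the fraud trial verdict",
        "fraud trial continues, writes John Smith",
        "John Smith covers day three of the fraud trial",
    ]

    # Count the words that co-occur with the name, the way a statistical
    # model picks up associations from raw text.
    cooccur = Counter()
    for doc in corpus:
        words = doc.lower().split()
        if "john" in words and "smith" in words:
            cooccur.update(w for w in words if w not in ("john", "smith"))

    # "fraud" and "trial" dominate, so a model trained only on this text
    # links the name to the trial itself, not to reporting on it.
    print(cooccur.most_common(3))
    ```

    A real LLM predicts tokens from far richer features than raw co-occurrence counts, but the failure mode is the same: association without any explicit statement of the person’s role.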