I was curious whether any other country has shows where they talk to or interview tourists as much as Japan does. I know the US doesn’t.
Per Wikipedia, France is the most visited country by international tourists. I just can’t imagine French TV shows running around interviewing people about how great France is.
Thailand has more visitors than Japan. TV host – “Excuse me! Why are you visiting Thailand?” Old white guy – “ahhh..” runs off.
Not trying to start an argument, just curious whether this is another one of those Japan-only things.