
Creepy Words

Word salad

In late 2020 I began to notice a curious extension to the once harmless word access, both as a noun and as a verb, during the concerted vaccine awareness-raising campaign. Covid sceptics had warned early on that the authorities would make regular genetic code injections a condition for participation in mainstream society with the introduction of digital health passports. Ultimately, these would be tied to Central Bank Digital Currencies (CBDCs) to make life practically impossible for citizens without means of independent subsistence. Alas, TV pundits and online influencers seemed more concerned about access to vaccines in the context of the equally Orwellian concept of vaccine equity. I can understand notions such as access to clean water, access to reliable electric power and even Wi-Fi access. Water is essential to life on earth, while electricity and modern telecommunications can greatly enrich our lives. Yet we only crave access to things we need or enjoy. Nobody demands access to something they do not want. That demand must be manufactured. This intriguing juxtaposition of words implied that people might suffer because of a lack of a new pharmaceutical intervention that had never before been tested on billions of human beings. Who decided that we needed to have access to these new concoctions? Did we ever have any proof that we could no longer survive without them? The EU signed a €71 billion contract with Pfizer-BioNTech to buy 4.6 billion injections, or more than 10 for each EU resident, and sent hundreds of millions of jabs to Africa that went largely unused. There was never a shortage or a lack of access to something most people did not need. There was only ever a huge glut and a massive overspend on coercion and enforcement. What people wanted was access to workplaces, bars, restaurants, sports venues, hotels and holidays abroad, for which they needed proof of Covid-19 vaccine compliance.
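A quick sanity check on that figure, assuming an EU population of roughly 450 million: 4.6 billion doses divided by 450 million residents works out at just over ten doses per person, which is where the claim of more than 10 jabs for each EU resident comes from.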

Language evolves all the time as it adapts to new social and technological realities. This is perfectly natural. Our forebears did not have snappy words for electronic pointing devices (mice) or personal digital assistants (pads or tablets) because they had not been invented yet. Neither did we have generic terms for someone we may employ to help us keep fit, as in personal trainer, or take our dogs for a walk when we’re too busy, as in dog walker. In the 1960s, the latter job title may have been comprehensible, but few would have seen such everyday tasks as career options. It stands to reason that language tends to change faster in times of rapid societal transformation. Cultural continuity helps us stay in touch with past generations and learn from history. Once the past becomes a foreign country with an unintelligible language, the managerial classes can more easily rewrite history and manipulate the masses. While the English language has coined thousands of new words, often with Greco-Latin roots, since the industrial revolution, some core concepts have remained cultural constants. Their pronunciation and dialectal variants may change, but the basic ideas stay the same. All languages have words for man, woman, child, mother and father. They correspond to the fundamental roles we play in procreation and in raising the next generation. Whether you are male or female was, until very recently, a matter of easily verifiable biology. Our ancestors may not have mastered the science of chromosomes, but they understood that only women can give birth to children and only men can impregnate them.

Words like customer, mental health, protection, safety and access may seem innocent enough. They are hardly newcomers to our language, but the ideas and feelings they convey have mutated, sometimes out of all recognition. A customer used to be someone who chose to pay for a service or product. If you don’t like a product or service, you can always take your custom elsewhere. Today, it often means a service user, with no choice over whether to use the service or not. Mental health has progressed from a general concern over someone’s emotional wellbeing into a pervasive intrusion into people’s private lives and inner thought processes. Protection no longer refers only to sensible measures you take to ward off physical harm, but also to a temporary immunity from prosecution. More creepily, safety no longer refers to voluntary protection from danger, but to artificial isolation from our natural environment. An obsession with a narrow aspect of relative safety can expose people to greater danger. Leaving a frail elderly person with mild dementia home alone without physical contact may reduce the spread of infectious diseases, but it increases all other causes of ill-health, not least through loneliness.

Let us return to the creepiest case of semantic drift, namely access. Traditionally, the word was much more common in technical or formal usage. In everyday speech, we opted for simpler or clearer expressions. There may be many reasons why you cannot or may not visit a restaurant. If it serves alcohol, there may be a minimum age for minors unaccompanied by adults. It may be hard to reach, possibly involving a strenuous walk up a long and winding path. Its owners may have banned you because of previous misbehaviour. You may have to present a racial purity pass or a digital health certificate to enter the premises and, of course, you may not be able to afford the bill. The bland term access now covers all such eventualities. An accessible restaurant would be open to all, affordable, have facilities to accommodate people with physical disabilities and cater for all dietary needs and preferences. Such a restaurant is unlikely to be very special. If you travel to a Tuscan hilltop village to visit a rustic steakhouse with a bespoke selection of locally sourced seasonal vegetables, you should hardly complain that it is not accessible to wheelchair-bound, cash-strapped vegans allergic to almost everything on the menu. They are not its market. The restaurant is not in the business of being accessible to all and sundry, but of catering to a niche clientele who go out of their way to sample a unique culinary experience away from the madding crowd. Accessibility is not always good. Mountaineers do not climb Kilimanjaro, Aconcagua or Mount Everest because they’re accessible, but because they are the ultimate challenge. Their beauty lies in their inaccessibility.

Accessibility often appears in official jargon alongside other deceptive buzzwords like equality and diversity. Despite all the anti-discrimination rhetoric and legislation, the wealth gap has never been larger and culture has never been more homogeneous around the world. Likewise, accessibility initiatives seldom empower the poor and vulnerable to gain access to venues erstwhile reserved to the lucky few, but rather adapt services targeted at the disadvantaged. Don’t get me wrong, I’m all for helping the physically disabled to lead more independent and rewarding lives, where feasible, but you can rest assured that if the ruling classes do not want us to access certain publicly funded venues, they will find other means to exclude us, usually under the pretexts of safety or security. They only care about access to things they want us to use. It hardly comes as a surprise that following the overturning of the landmark Roe v. Wade ruling on the availability of legal abortion in the United States and recent calls for tighter restrictions on abortion in Italy, we now hear talk of access to abortion. As more and more jurisdictions extend the scope of legal euthanasia from a practice reserved for the terminally ill suffering from excruciating pain to people with mental health challenges, talking heads have already started to complain about lack of access to safe and effective euthanasia services. Anything, no matter how immoral, seems so much more palatable when dressed up in health-and-safety verbiage. There is nothing safe about death and nothing good about access to tools of biotechnical subjugation.


How the Blair Era begat Corbynism

Tony Blair and Jeremy Corbyn
And how powerful forces commandeer youthful idealism to further totalitarian aims

Labour's new army of social justice warriors have learned a bitter lesson. While they may appeal to some special interest groups and social service professionals, they have lost touch with their base outside a few culturally diverse inner-city areas. As the results of the snap December 2019 election poured in, it turned out Labour's vote share of around 32% was not quite as low as many of its supporters may have feared. Let's get things into perspective: in 1983 under Michael Foot Labour's vote plunged below 28%, and in 2005 under Tony Blair Labour managed to win a comfortable majority on just 35% of the vote. In both 2010 and 2015, under fairly orthodox centrist leaders, Labour polled just 29% and 30.4% of the vote respectively. However, with a radically changed demographic, the quirky arithmetic of the First Past the Post electoral system now works against them and favours the Tories and SNP.

Two Cheeks of the Same Backside

It's hardly a coincidence that both Blairism and Corbynism hail from the trendy inner London borough of Islington with its sky-high property prices and extremes of wealth. Tony Blair and Peter Mandelson lived there in the 1990s as they planned to take over the Labour Party, as do Jeremy Corbyn, Emily Thornberry and Jon Lansman, the architect of the Momentum cult. Many Blairites had been Trotskyists, Maoists or Stalinists in their youth. They just recognised the need to embrace big business and strategically support the projection of US-led cultural and military hegemony. However, their goal has long been a technocratic one-world government that suppresses true cultural diversity and undermines the last vestiges of self-determination. Their apparent differences centred on short-term strategy, mainly support for destabilising global policing operations and endless debate about the Israeli/Palestinian conflict. Broadly speaking both Blairites and Corbynites come from the same privileged social class with an entourage of token working class acolytes. The sycophantic Blair babes of the early 2000s seem to have been replaced by a new breed of wishful thinkers such as Rebecca Long-Bailey and Jess Phillips. It may be hard to understand the common purpose of Jeremy Corbyn, who rebelled against all of Blair's military escapades, and Tony Blair, who joined forces with George W Bush to invade Iraq. To the architects of a borderless new world order, these military conflicts serve mainly to destabilise nation states. Indeed they may welcome the destabilisation of Europe and North America in the 2020s as much as they relished the dismemberment of the Middle East and Central Asia in the first two decades of this century. Periodically they will engineer a changing of the guard, so the new management team can dissociate itself from the mistakes of the previous leadership. You can't get more pro-establishment than Nick Clegg, former Deputy Prime Minister and leader of the Liberal Democrats, now working for Facebook. Yet in public he claimed to have opposed UK involvement in the invasion of Iraq to earn street credibility among disillusioned Labour voters. Admittedly I almost voted LibDem myself before I fully grasped the consequences of the cultural revolution that started under Blair and has continued ever since under the fake Conservatives.

How Millennials who grew up under Blair embraced Cultural Marxism

Back in the mid-1990s I wrongly saw Tony Blair as the heir to Thatcher. Indeed, New Labour embraced Thatcher-era privatisation, expanded private sector involvement in the National Health Service via controversial Private Finance Initiatives and continued to outsource more and more public services to commercial service providers. It steadfastly refused to nationalise the railways, but committed to the renewal of Britain's ageing nuclear deterrent and eagerly assisted the US military industrial complex in its global policing operations in the Balkans, Africa and the Middle East. Tony Blair enjoyed being a loyal sidekick of Presidents Bill Clinton and George W Bush alike. Many traditional Labour supporters opposed these policies from the left. The late 1990s were, in the context of the impending cultural revolution, a historical hiatus. Seven years after the Soviet Union had disbanded and four years after South Africa inaugurated Nelson Mandela as its first black President, representatives of Northern Ireland's warring factions, including Sinn Fein and the Ulster Defence Association, agreed to a ceasefire in the much heralded 1998 Good Friday Agreement, brokered by Labour's new Secretary of State for Northern Ireland, Mo Mowlam, but building on negotiations that had begun under the previous Conservative government. As devolved parliaments opened in Scotland and Wales, we seemed to be on the verge of a new era of greater social peace and prosperity, while retaining our cherished personal freedoms and cultural heritage. Alas, few observers fully appreciated the scale of the impending cultural revolution as fewer and fewer young adults could get on the housing ladder and all too often succumbed to a culture of hyper-dependence.

While Thatcher appealed to traditional family values and championed small businesses, Blair appealed to pop culture and embraced the entertainment business. In reality Thatcher-era reforms had not just destroyed millions of stable manufacturing jobs, they had prevented many young men from marrying and starting families as primary breadwinners. Attitudes to traditional marriage had begun to change in the swinging 1960s, but the demise of secure jobs for working class young men without good academic qualifications meant many young women turned to the state rather than marriage to help them fulfil their natural desire for motherhood.

Despite the hype, the all-powerful state has never really receded; it has merely handed over some of its operations to unaccountable large corporations with labyrinthine management structures, while expanding in other areas, most notably in social surveillance and welfare provision. Contrary to popular perceptions, public spending rose in the first years of the Thatcher government and only declined as a percentage of GDP in the more prosperous late 1980s as the economy grew and unemployment fell. In no year since 1946 has public spending fallen in absolute terms, even accounting for inflation. What really matters more than the proportion of the economy under direct state control is personal independence, or the extent to which we are masters of our own destiny or beholden to external agencies. Over the last four decades a growing proportion of our income goes not to life's essentials but to rent, mortgage payments, loan repayments, insurance, commuting and various communication and entertainment services we never used to need. It's often much easier to divorce your spouse than to end legally binding contracts that limit your personal budget to a mere fraction of your theoretical net earnings. We spend much of our remaining disposable income in a handful of supermarket chains and other retail outlets, restaurants, pubs, gyms, leisure centres and clubs controlled by big business. In the early 2000s I had to reassess my earlier analysis that our ruling classes wanted to roll back the state leaving only bare bones public services for the poor. Instead it dawned on me that a growing underclass was trapped in a vicious cycle of welfare dependency, substance abuse, family breakdowns and myriad emotional challenges interpreted as mental illnesses. To escape this trap, you'd expect government agencies to help young adults gain the kind of skills that today's high-tech job market needs. Yet they only ever made half-hearted attempts at workfare and seemed quite happy for an influx of Eastern European migrants to fill vacancies that local youngsters could have snapped up with the right incentives, thereby denying hundreds of thousands of young adults not just an opportunity to gain critical work experience, but also greater personal independence.

One of New Labour's flagship policies, besides the national minimum wage, was the introduction of working family tax credits. On paper this sounded like a great idea, making low-paid jobs pay and helping young families make ends meet. In practice it subsidised penny-pinching employers and naturally redefined families as any combination of adults and children who live together. More significantly, these new benefits were available to all EU citizens no matter how long they had lived in the UK or paid into the system. One of the main reasons many youngsters from deprived neighbourhoods in the North of England do not move to London to take advantage of a buoyant labour market and higher wages in the bustling service sector is sky-high rents and the hurdles you have to clear to gain access to housing benefit. I know from talking with many Eastern European bar staff that recruitment agencies and their extended ex-pat community would often help with shared accommodation for new migrant workers.

I still think New Labour missed a golden opportunity. From day one they should have fulfilled their promises by investing heavily in technical skills in the most deprived communities of their former industrial heartlands and weaned the welfare-dependent underclasses off benefits not through uninspiring temporary jobs, but through re-training and a culture of creative innovation. They could have expressed their love of continental Europe not through slavish devotion to a federal superstate, but by emulating German and Dutch vocational colleges and subsidised apprenticeships. If we lack good plumbers, mechanics, electricians, nurses and doctors, surely we can train our own. That doesn't mean we can't have exchanges with other countries; it just means employers don't have to keep recruiting from abroad because of a dearth of qualified candidates locally. However, it should now be abundantly clear that the government's senior policy advisors had no intention of empowering the local working classes. As Andrew Neather revealed, they wanted to rub the right's nose in diversity, but their definition of right-wing did not mean a small band of wealthy stockbrokers and aristocrats, but rather the socially conservative native working classes. If you did not embrace our new multicultural reality and were not involved in our growing media, marketing and social engineering sectors, our globally minded managerial classes considered you an ignorant country bumpkin at best or a racist thug in urgent need of psychiatric treatment. By multiculturalism, they did not mean respecting the many cultures that have evolved gradually over many generations in different parts of the world, but rather a post-modern reality of parallel ethnoreligious communities struggling to intermingle and cope with cultural convergence in their new neighbourhoods alongside other groups of newcomers. Their idea of diversity is a wide range of ethnically themed restaurants, boutiques, dress codes and skin colours, more chicken tikka masala than smörgåsbord, combined with a convergence of lifestyles submerged by mass-marketed universalism. To the cheerleaders of fake diversity what matters most is helplessness, namely complete dependence on external authorities. They see identity groups as constituents thankful for more social surveillance to keep the peace. It hardly matters if Christian Afro-Caribbeans value traditional two-parent families or young English gay party revellers distrust Islamic fundamentalists in their neighbourhood; everyone is supposed to unite in their superficial diversity.

Who's behind Cultural Marxism?

Just as Blair built on many Thatcher-era policies favouring big business interests, Cameron and May continued New Labour's cultural revolution, with key public policies emanating not from nominally Conservative politicians, but from corporate thinktanks and NGOs. Increasingly over recent decades large corporations, nominally in the private sector, have promoted dysfunctional lifestyle choices and fake diversity.

You need only watch advertisements for leading retail outlets. They would once portray typical families broadly representative of their customer base, but today they clearly go out of their way to overemphasise diversity, often showing happy households with mixed race gay parents enjoying a meal with their Muslim neighbours. Rather than simply reflecting reality on the ground, advertisers seek to drive cultural change by presenting a rose-tinted glimpse of our projected future.

Take for example the controversy over self-identification of one's perceived gender, which featured in both the Labour and Liberal Democrat manifestos. This is still a fringe issue that concerns only a few confused individuals who have been persuaded to attribute their psychological challenges to a redefinition of gender roles. It turns out the Liberal Democrats had accepted a large donation from a pharmaceutical multinational that produces puberty-blocker drugs. Yet we are somehow led to believe by the virtue-signalling echo chamber of social justice warriors that the campaign for transgender rights comes from grassroots activism and not from corporate lobbyists. This begs the question of why businesses that theoretically want to make profits and expand their clientele should invest so much money promoting lifestyle choices that greatly limit personal independence. It's because they need captive consumers more than conscientious workers who actually provide the products and services we need.

Cultural Marxism has only taken root in the millennial generation because both academia and big business actively promote it. You're hardly rebelling against the system if you're faithfully recycling talking points coined by advertising agencies. When I worked as a contractor in the offices of Saatchi and Saatchi (the advertising agency behind the election of both Margaret Thatcher and Tony Blair), marketing staff would religiously read the Huffington Post and embrace the key tenets of identity politics. Social conservatism is much more prevalent in working class communities. Indeed the coming years may see the emergence of new and unexpected alliances to resist top-down social engineering. I think most parents across the British Isles disagree with gender theory lessons in primary schools. At least they would do, if they understood what their sons and daughters were learning in deceptively childish language. When mainly Muslim parents protested against the No Outsiders programme in Birmingham schools, the mainstream media tried to dismiss these protests as the bigoted views of religious fundamentalists. If you watch Channel 4, you may have noticed a break from the usual derision of the xenophobic white Anglo-Saxon working classes to focus instead on transphobic and homophobic Muslims. Sooner or later the parallel ethno-religious communities of our big cities may actually find common cause to resist wokeness and stand up for common sense. Finally people on the ground may realise how disparate groups are being played against each other.

What will happen to Momentum?

I suspect the masterminds of the Corbyn Cult knew full well they could never really win over the working classes outside their metropolitan bubbles. That's why they cheer on the proliferation of new welfare-dependent communities in large cities and the ethnic cleansing of many towns. If only we could extend the vote to 16-year-olds or let new migrants vote before they've gained full citizenship! If only we could encourage more apolitical misfits to opt for postal votes in the hope they will vote for the nice party promising them more free stuff. While Blair's spin doctors knew how to appeal to the core Labour vote with platitudes about better education and job creation, Corbyn's handlers offered only patronising charity and public spending commitments they clearly could not honour. However, Momentum will not disappear; it will merely morph into a permanent vanguard movement driving dysfunctional lifestyle changes that ultimately serve the interests of the same big businesses who, for now, are happy with Boris Johnson's fake Conservatives. The only consolation prize is that we may have at least exposed the true agenda of global totalitarians.


The Abolition of Britain and the Rise of Global Governance

How the quest for greater independence is being usurped by power-hungry control freaks

I make no bets on the outcome of the snap General Election scheduled for 12th December. Last time, a healthy Tory majority seemed almost certain until a couple of weeks before polling day, after a disastrous Conservative election campaign. For the first time in recent history, Labour did much better than expected. My hunch is Boris Johnson's party will win a comfortable majority of seats because the core working class electorate have lost all faith in Labour, but I doubt the resulting managerial team will do much to protect British workers from the excesses of globalism. I hope the government's ineptitude may oddly strengthen the resilience of ambitious youngsters as they realise the state will not help them fulfil their dreams and thus avoid succumbing to a prevailing culture of victimhood and entitlement.

We may well see another shift among the affluent managerial and business classes from the Tories to the misnamed Liberal Democrats (or the illiberal unDemocrats as I call them), while many traditional Labour voters either sit at home, strategically vote Conservative or flirt with the Brexit Party to keep out Labour, whom they now see as the party of unlimited mass migration, toxic identity politics and undeliverable spending commitments. However, in Scotland Labour will lose out not only to the Conservatives, but to a resurgent SNP capitalising on fashionable anti-English sentiment. They see Brexit as the brainchild of English Tories eager to resurrect the British Empire. If we assume current polling is correct, the political map of mainland Britain will be split into four. The Tories will dominate English shires and towns, the Liberal Democrats will do well in the most affluent neighbourhoods, while Labour will keep most of its metropolitan strongholds among its special victim groups, welfare-dependents, social engineers and trendy students. By contrast, owing to the vagaries of the First Past the Post system, Nicola Sturgeon's cult movement look set to snap up most Scottish seats, as the anti-SNP vote is too evenly split. The Brexit Party will be lucky to gain 1 or 2 seats in former UKIP strongholds, but they may succeed only in letting Labour hold on to a few more marginals.

The ongoing Brexit saga amid yet another General Election with very uninspiring choices has revealed two unwelcome realities. First, most nation states have limited independence from global banks and corporations, supranational institutions and a well-funded network of nominally independent non-governmental organisations (NGOs) posing as humanitarian charities. Second, and perhaps more important, it has exposed what our ruling classes really think about democracy. If they cannot persuade the great unwashed masses to endorse their social engineering plans by electing a bunch of middle managers who will cooperate with the agents of change, they will destabilise your country and have you begging for their intervention.

Whatever the relative merits of the European Union may be, the outcome of the Brexit referendum represented a huge kick in the backside for the metropolitan elite, who for decades have presided over the steady transfer of power from time-honoured local institutions to more remote international entities in the name of progress. Let us be under no illusions: the EU is only a means to an end, not the end itself. There are many good reasons to welcome close cooperation among Europe's disparate peoples to protect our cultural heritage and defend us against the worst excesses of what we once viewed as neoliberal globalism, especially as a counterbalance to the North American and Chinese models with their extreme forms of plutocracy. Just 15 years ago, in the aftermath of the joint US and UK occupation of Iraq, many of us wanted to distance ourselves from the British and American foreign policy establishment. Many of us hoped a European Community of independent, peace-loving and democratic nation states with strong protections both for personal freedom and social justice could offer an alternative to Anglo-American capitalism.

While many other countries appeared insecure and in imminent danger of fragmentation, civil war and greater subjugation to imperial forces, Britain seemed impervious. Only the Northern Irish conflict ever posed a security threat, although behind the scenes the British Civil Service has long viewed the province as more of a burden than a strategic asset. Scottish and Welsh nationalism remained relatively tame disputes, quibbling mainly about the extent of autonomy within the United Kingdom. Few thought any major part of the UK would join another major superstate. The Republic of Ireland has since its inception remained steadfastly neutral, so even if Northern Ireland voted to join the Republic, there would be no fundamental shift in the balance of power. Leaving aside widespread opposition to the deployment of Trident nuclear missiles at Faslane, just northwest of Glasgow, Scotland has long been way too reliant on tight integration with the British military industrial complex for mainstream politicians to advocate military independence from the rest of the UK and from NATO, although this was the official SNP position until 2012.

Sea Change

Before around 2012 the European issue seemed very much off the radar. Transnational bodies like the EU, NATO and the UN were just facts of our increasingly internationalised lives, but not things we felt affected us day to day. Broadly speaking, most Europeans opposed further centralisation, preferring to keep control of economic, social and military policy at a more accountable national level, but many still believed our politicians somehow represented our interests at various international gatherings. We saw this opposition in referendums in Ireland, France, the Netherlands and Denmark, where voters rejected new treaties (Nice and Lisbon in Ireland, the European Constitution in France and the Netherlands, Maastricht in Denmark), only to see their votes ignored or to be forced to vote again after cosmetic changes. However, we could also argue that the public have grown so disillusioned with the sorry state of national politics that they'd rather place their trust in shiny new progressive institutions that transcend traditional boundaries. For decades the establishment media has tried to persuade Europeans that they can trust the EU and NATO more than their local regimes with their chequered history of corruption and despotism. In the early 1970s not only was most of Eastern Europe behind the Iron Curtain, but Greece, Spain and Portugal still had military dictatorships appealing to traditional Christian values to ward off the dual dangers of Eastern communism and Western decadence. Not surprisingly millions of younger Europeans welcomed the fall of these paternalistic regimes and embraced a new era of mass consumerism combined with a comfortable social safety net. While millions of Greeks, Spaniards and Portuguese may be critical of the budgetary constraints imposed on their governments to keep the Euro alive, they still tend to associate the EU with the greater prosperity they've enjoyed since the 1980s.

The situation in Britain is very different. The golden era of the British working classes was the 1950s and 60s. Sure, we lacked many of the modern conveniences made more affordable by recent technological progress, e.g. many homes had outside toilets and coal fires instead of central heating, and cars were still a luxury for many, but what mattered most was that the relative quality of life was steadily improving with a high level of upward social mobility. A typical school leaver could aspire to getting a decent skilled job as an apprentice and earn enough to be able to marry, buy a house and start a family by his or her mid to late twenties, all without welfare handouts. We hoped progress would empower families to lead more independent lives while still enjoying the fruits of a civil society with a high degree of social trust and mutual respect. Little did we know that many of our mission-critical jobs would be first outsourced and then automated as big business had to rein in the collective bargaining power of trade unions. The long-drawn-out demise of British industry, kept on life support during the 1970s, weakened the resolve and resilience of the working classes, blamed for demanding unmerited pay rises, being too lazy and lacking the industriousness of their European and Japanese colleagues. Yet to this day, many observers simply blame Thatcher for turning off a life support machine that had squandered countless billions trying to save outdated industries that could not survive the challenge of global competitors able not only to tap into a seemingly limitless supply of cheap labour, but also to close or retool outdated manufacturing facilities quickly with little regard for job security.

I noticed that even as long ago as the 1979 General Election, which saw Margaret Thatcher's Tories win a healthy majority of seats, Labour had begun to shift its focus from standing up for workers' rights to championing welfare and public services. Thatcher managed to appeal to the aspirational working classes, the kind of people who wanted to own a house, drive a car, holiday in Spain and earn a decent living through a career in the growing service sector. While some workers adapted and some new light manufacturing outfits took the place of heavy industry, many youngsters in Labour's working class heartlands outside the more prosperous South East of England inherited the helplessness of parents who had failed to adapt and thus became trapped on welfare or in short-term call centre jobs, leading inevitably to dysfunctional households and social dislocation. Nonetheless a major rebranding effort saved the Labour Party as it embraced Thatcherite reforms, the information revolution and pop culture while promising not to raise taxes.

I was an early sceptic of Blairite Magic. Somehow Blair's soundbites lacked substance or analytical integrity, but one slogan stuck in my mind: "Education, Education, Education". If you believed the hype, we were on the verge of a quantum leap in scientific excellence. The next generation would become talented doctors, inventors, bioscientists, software developers and robotics engineers. Alas very few did, but many more became recruiters, public relations officers, graphic designers, creative directors or worked on the peripheries of emerging high-tech industries in new-fangled specialisations such as forensic science or environmental science, learning how to engage with technologies that someone else developed to monitor other people's behaviour, market goods or ensure minimum health and safety standards. With such a dearth of tech-savvy innovators and entrepreneurs, British professionals have focused mainly on people management and persuasion, a sector encompassing not only advertising, public relations and entertainment, but also behaviour and attitude modification through charities and education. For every engineer developing new technology to help us solve practical environmental challenges, there are many more climate change awareness officers or busybodies lecturing parents on how to deal with tantrums without smacking. The net result is a dual culture of dependence, either on state handouts or on corporate largesse, and greatly reduced personal resilience. The first Blair government famously rebranded Britain as Cool Britannia, more about rock stars than scientific pioneers. Now the last gasp of British cultural innovation has been co-opted by the multibillion-dollar entertainment industry and blended into a global culture disconnected from the specific locales of post-imperial suburban Britain. In the same period Global English has begun its shift from a high-status international language modelled on standard British or American English to a rapidly mutating form of Newspeak inspired by a worldwide intelligentsia with little reference to the speech patterns of the transient residents of London or New York City. Native speakers have thus lost the relative advantage they once had over those who acquired the language later in life.

As a historical paradox, the country that has given the world its dominant lingua franca now suffers from an acute identity crisis as progressive opinion leaders attempt to deny there is such a thing as a native English person. This mirrors trends in other European countries, with almost identical claims going mainstream in Germany and Sweden too. National identity for many in cosmopolitan areas has been reduced to a mere temporary allegiance to one's country of residence during occasional sporting events.

What's left of Britishness anyway?

Many Ulster unionists are none too happy about Boris Johnson's deal to keep their province in regulatory alignment with the EU's Customs Union and Single Market, with customs checks in the Irish Sea rather than along the meandering border with the Republic of Ireland. Increasingly only the Democratic Unionist Party defends traditional values, while Sinn Fein, claiming to represent the Catholic community, has recently endorsed positions on gay marriage, LGBTQ-friendly sex education and abortion perfectly aligned with the cultural left, but at variance with Catholic teachings. However, a growing proportion of the younger generation identify neither with Protestantism nor Catholicism and are very open to unification with what has become a secular Ireland. The British Deep State seems more concerned about the perceived Russian threat than about subsidising Northern Ireland.

This begs the question of whether the CEOs of UK PLC really care that much about the constitutional status of Scotland, now that they know a nominally independent Scotland would both stay in NATO and join the new European Defence Union. Universalist media outlets treat Nicola Sturgeon's SNP much more favourably than the Brexit Party or even the Tory Party.

However, I sense a split between the Atlanticist and Europhile wings of the British intelligentsia. Recent statements from Emmanuel Macron, Guy Verhofstadt and the incoming European Commission President Ursula von der Leyen have revealed a gradual shift from a unified European military command working within NATO alongside the USA to a European Army taking over from the USA in global policing operations in the Middle East, Africa and Central Asia. More disturbing is the growing hostility among the Western European elites towards Russia. In just a few years neo-conservative war hawks have shifted their lobbying operations from Washington DC to Brussels. To match US military spending, the European Union would have to double its spending, something that would be very unpopular at a national level, but could only be justified by the spectre of a Russian and/or Chinese threat. Even if Trump is re-elected in 2020, US military adventurism has peaked. The federal government can no longer justify such a massive defence budget when they have bigger challenges at home with rapidly changing demographics. It's only a matter of time before someone like Tulsi Gabbard or Alexandria Ocasio-Cortez becomes the president of a debt-ridden federation in a post-dollar world order dominated by the Chinese and Indian economies.

Without Scotland and Ulster, England and Wales would be a very disunited place, with London behaving more and more like a city state divorced from its geographic hinterland and with parallel communities in many other towns and cities.

In all likelihood Boris Johnson's BRINO, or Brexit In Name Only, will avert Scottish Independence for a few years before other events overshadow it, Ulster quietly merges with a post-Christian Eire and the Scots turn against the SNP. Meanwhile continental Europe will struggle to cope with the fast pace of cultural and demographic metamorphosis, a looming banking crisis and an escalation of the civil unrest that has spread across France over the last year. We may just be able to salvage a federation of the British Isles, but with waning faith in traditional British institutions such as the Monarchy (and far be it from me to comment on Prince Andrew's close friendship with American sex predator Jeffrey Epstein), this island seems ripe for Balkan-style destabilisation, with the people's spat over Brexit serving as a trial run for a much deeper conflict over culture, identity and power.


Language is more about Meaning than Words

Tower of Babel

I recently dived into an almighty row with a bunch of EU flag wavers replying to a message celebrating International Native Language Day. You see, I'd like to celebrate it too, but the subtext implied EU citizens in the UK promote linguistic diversity. So let's think this through. Once abroad mingling with the locals and other newcomers, emigrants tend to neglect their own language unless it's widely spoken in their new country or they join a network of compatriots, in which case they're not integrating. Native English speakers are seldom inspired to learn other languages these days, as they can get by with English alone in many tourist resorts and cosmopolitan cities around Europe. Indeed in some places if you try to speak the local language, your interlocutor will answer in English, either because this comes as second nature to them or because they want to flaunt their proficiency in a more prestigious medium of verbal communication. No self-respecting go-getter wants to be written off as a country bumpkin unable to converse fluently in the global lingua franca. Like it or not, globalisation tends to strengthen strong languages to the detriment of weaker tongues. Some government agencies may pay lip service to local heritage by insisting on bilingual or trilingual signs, but unless all languages involved are actively spoken across multiple domains of everyday communication, such signs act as little more than an exercise in public relations. Some may try to deny this reality, pointing to initiatives to revive endangered languages like Welsh, Basque or Romansh or embracing bilingualism as a way to reconcile the rather obvious conflict between worldwide cultural convergence and a desire to retain our diverse cultural heritage. I'm the first to stress the benefits of learning more than one language to expand your intellectual horizons and escape the semantic prison of monolingualism. A distinction that may seem crystal clear in one language is not in another, e.g. in English we have separate verbs for feeling, hearing and smelling, all referring to different forms of perception, but in everyday Italian all three can be sentire. Naturally context usually makes it clear which sensory organs perceive a phenomenon. By contrast English has two catch-all verbs, get and set, which cover a vast semantic range, often only understandable in context, i.e. with reference to other words, e.g. I got it may mean I understood the message or it may literally mean I obtained it depending on what it is, which is partly why native English speakers tend to specify the names of common objects and concepts rather than resort to ambiguous pronouns (e.g. wash the dishes or do the washing-up are preferred as set phrases rather than grammatically and semantically correct constructs like wash them where them refers to the dirty dishes your partner just mentioned).

Not just Etymology

However, many amateur linguists fall into the trap of focusing solely on etymology, which helps us trace the cultural evolution of a language community through words alone but tells only part of the story. Most linguists would classify English as a West Germanic language with a large Graeco-Latin vocabulary, acquired largely through Norman French, supplanting or supplementing Anglo-Saxon words. Between the 10th and 13th centuries Old English underwent a rather dramatic transformation from a close cousin of Old Low German with three grammatical genders, five cases, inflected adjectives and a much more flexible word order to a simpler but more analytical tongue by Chaucer's time. We can't trace the exact progression of this metamorphosis as Norman French and Latin served as the main vehicles of written communication after the new Norman aristocracy had displaced the old Anglo-Saxon ruling class. However, etymologists fail to explain why Middle and Early Modern English had so few Celtic loanwords yet diverged morphologically from contemporary languages spoken in adjacent regions of continental Europe. One would expect that a synthesis of West Germanic and French, itself evolving from the vulgar Latin adopted by former speakers of Gaulish, a Celtic language, would yield a language resembling Flemish in syntax and semantics. Yet Middle English diverged not just from its Germanic and Romance cousins, but from Brythonic too (Cornish and Welsh, spoken as far north as Cumbria and Galloway) in discarding grammatical genders and most inflexions (except plurals and the Anglo-Saxon genitive). Modern insular Celtic languages have two grammatical genders and a VSO (verb-subject-object) word order, unlike English, which has a stable SVO order, and always place adjectives after the nouns they describe, unlike English, where they usually precede the nouns they modify. Until recently anthropologists offered two explanations: first, that Old English prevailed over autochthonous Celtic languages because of its higher prestige; second, that invading Anglo-Saxons drove the Celts to the western fringes of the British Isles. Brythonic dialects survived in Cornwall until the late 18th century and in Cumbria until the 12th century. However, examples of written Insular Celtic predate the earliest records of written Germanic. Although most literature in the Roman period was in Latin, a tradition that continued in academic circles for many centuries thereafter, Celtic inscriptions can be found over much of Western Britain, but not in most of what later became England. Christianity spread to the British Isles between the 5th and 7th centuries mainly from the Celtic West, another clue that the Anglo-Saxons did not culturally eclipse the extant Celtic civilisation, yet genetic evidence suggests they did not supplant the local population either. In over 200 years Anglo-Saxon migrations from continental Europe would add around 5% to the gene pool. Analysis of the haplogroups extracted from the Y-chromosome DNA of skeletons reveals gradual migratory patterns responding largely to environmental changes. As much as 75% of the gene pool of the settled British population, before recent waves of migration since the 1950s, can be traced to settlers who arrived in these Isles between twelve and four thousand years ago. Subsequent migrants added to the gene pool and assimilated over a protracted period. In Roman times the ruling classes and their foot soldiers made up little more than 1% of the population.
This begs the question: Why would a mainly Celtic-speaking people abandon their native tongue in favour of a newly imported Germanic language that lacked the prestige of Latin, while leaving few traces of the Brythonic vocabulary or syntax, something we'd normally only expect to happen in the event of large-scale ethnic cleansing, which is alas unsupported by the archaeological evidence?

Substrate Languages

Arguably languages do not so much disappear as fall into disuse as the descendants of the original speakers adopt more prestigious speech registers. It took vulgar Latin around 800 years to supplant Gaulish and the pre-Indo-European Vasconic tongues of Aquitaine (related to modern Basque). This followed a process of gradual acculturation of illiterate commoners to the more erudite urbanites, who had already adopted Latin. We see a similar process today in many of Africa's burgeoning metropolises, where newcomers discard native African languages in favour of street slang based on a mix of the official language (English, French or Portuguese) with morphology and phrases borrowed from their ancestral languages. In France Latin evolved into French and Occitan. In Iberia it morphed into Catalan, Spanish and Portuguese. Yet in none of these regions was Latin or even a closely related Italic language the dominant tongue before Roman colonisation. Moreover, in none of these regions did the Romans displace most of the indigenous peoples, who gravitated over many centuries to more prestigious modes of communication, contributing to a radical restructuring of Latin's successor languages, e.g. Latin had three genders, six cases and an underlying subject-object-verb (SOV) word order, more akin to Sanskrit or Old German than to modern Italian or Spanish. So Gaulish and Aquitanian live on as substrata of modern French, influencing not only its morphology, but also its semantic range, e.g. the French penchant for counting in twenties with baffling forms such as quatre-vingt-dix for ninety (literally four-twenty-ten, i.e. 4 × 20 + 10) has its roots in Gaulish and mirrors the old Welsh pattern of only having second-tier numerals for twenties rather than tens (e.g. thirty-one would be twenty eleven). Oddly most words of Celtic origin in modern English were not borrowed directly from Welsh or Gaelic, but came to us via French (e.g. ambassador, beak, brave, budget, car, cream, change, embassy, glean, gob, piece, quay, truant, valet, vassal etc.). So why would Celtic exert more influence on Old French than on Early and Middle English? Did Anglo-Saxon invaders succeed in persuading the natives to ditch their mother tongue completely where the Romans had failed? However, there is an alternative hypothesis that displeases Celticists and Germanicists alike: most of the ancient tribes of Roman and pre-Roman Eastern Britain may have spoken pre-Indo-European rather than Celtic languages, whose speakers later came into contact with tribes speaking a purported fourth branch of the Germanic family, once spoken in the Low Countries, thus facilitating the adoption of a lingua franca based on Anglo-Saxon, but with significant pre-Indo-European substrata. Modern Dutch and Flemish are based on Frankish dialects of West Germanic, which may have supplanted fourth-branch dialects. More importantly, surviving Old English manuscripts may well reflect an erudite variant of insular Anglo-Saxon rather than common English dialects. Contrary to popular belief, Roman and Greek scribes of the era did not usually identify the origin of the thousands of indigenous tongues they encountered so much as the tribes that spoke them and their relative mutual intelligibility. Greek scholars first applied the term Keltoi to refer to tribes of Dacia, a region now straddling modern Bulgaria and Romania, long before the Slavic expansion.
Besides, the Celtic languages of the Western British Isles may themselves have displaced earlier pre-Indo-European languages, so both the Celtic and Germanic dialects spoken in the British Isles evolved atop substrate tongues spoken by illiterate indigenous tribes. This leads us to another bone of contention. Did the Picts of Northern Scotland and Ireland speak a Celtic language or did they, as some scholars suggest, speak a pre-Indo-European tongue? All we have to go by are place names and Ogham inscriptions. Scottish Gaelic, which prevailed in the Highlands and Islands until the Highland clearances of the 18th century, came to Scotland from Ireland between the 5th and 8th centuries. To complicate matters further, the Picts may have borrowed much of their later vocabulary from Brythonic (the precursor to Welsh) before merging with Gaelic under the rule of Dal Riata. A common mistake many linguists make, especially when only limited textual sources remain, is to analyse only the etymology of words that resemble cognates in other known languages. This often leads to false positives. Just because the word for king in language A has a cognate in language B does not mean that language A borrowed the word from language B or that both languages evolved organically from a common ancestor with gradual changes in pronunciation and meaning. It's often more likely that both languages borrowed the word at different times from a more prestigious tongue that may have since lost its pre-eminence.

As we see today with the proliferation of English-like words, neologisms and trademarks in the world's 7000 surviving native tongues, we cannot judge a language merely by the origins of its commonest words. English has gained over two thirds of its vocabulary since the earliest literary works of Old English, but before its expansion in the colonial era English evolved as the lingua franca of the peoples of England, Southern Scotland (where it was once known as Inglis before being renamed Scots when the Scottish aristocracy abandoned Gaelic), the Pale around Dublin and parts of Wales. A language is thus a speech code handed down through generations and shared within a community. Speakers are free to borrow words from other languages and integrate them creatively into their own, assigning new meanings and combining loanwords with other words to express new concepts and nuances. What matters most is mutual intelligibility and cultural continuity, providing a frame of reference for shared experiences and history. Cultural discontinuity occurs only through ethnic cleansing, mass migration or colonial repression of native cultures. However, unless a people is completely eradicated, their ancestral tongues are still likely to influence the way they speak their new language, especially before the advent of universal compulsory schooling.

The Quirks of Insular English

English syntax differs from that of its continental neighbours in a few important respects. Many scholars have explained the language's rapid transformation following the Norman Conquest by its demotion to a vernacular spoken mainly by illiterate peasants. However, why did this not happen to many other European languages, which were seldom written before the Renaissance? Latvian successfully retained its highly inflected grammar despite only gaining a sizeable literature during the Latvian Awakening of the late 19th century.

  1. English has a rich variety of verb tenses with auxiliary verbs that convey important semantic distinctions between progressive and simple tenses, e.g. I play versus I'm playing, as well as between I've played and I played. Spoken French, Dutch, German and Northern Italian have all converged on a simpler range of tenses with a strong preference for simple forms for the present or near future and the present perfect (e.g. I have done) for past events. By contrast modern Celtic languages use continuous tenses for both progressive and simple actions. English distinguishes a general statement such as She plays the guitar, implying a habitual activity performed with some degree of competence, from She's playing the guitar, which merely describes her current activity.
  2. English uses the present perfect for events that started in the past but are still ongoing. Other European languages always use the present tense unless the referenced event has finished, e.g. "I've been waiting two hours for the bus" means I'm still waiting. Otherwise we would say "I waited two hours for the bus".
  3. English has lost grammatical genders and cases, but retains gender-specific pronouns referring to people, some animals and occasionally to personified objects (e.g. referring to a country, ship or car as she/her). This loss is not unique to English. It happened to Bengali, Armenian and Afrikaans too, but we cannot explain it simply by its temporary demotion to a vernacular or by the influence of neighbouring Celtic languages which have all retained grammatical genders.
  4. To maintain a consistent SVO word order, English uses auxiliary verbs for questions and negations, e.g. Did you wash the dishes? and I didn't wash the dishes. In Early Modern English the main exception to this rule was the verb to have, e.g. "Have you a match?", and this form persists in many set phrases and more conservative and literary varieties of English. In modern spoken British English the possessive sense of have is often reinforced with got (e.g. Have you got a match?), while in American English the form "Do you have a match?" is more common. Both constructs ensure a regular SVO order in both statements and questions. The verb to be may seem an exception, but as an intransitive verb it never takes a direct object, only a subject and a subjective complement, so a question like Is John a farmer? does not need another auxiliary verb to remain unambiguous.
  5. English prefers possessive pronouns to the reflexive or dative-possessor constructs common in most continental languages, e.g. "I washed my hands" translates as "Ich habe mir die Hände gewaschen" or "Je me suis lavé les mains". English also tends to specify possession far more often than other languages: "I rode my bike" is more colloquial in most contexts than "I rode the bike" (which would usually mean "I rode a previously specified bike"), though one could also say "I rode my sister's bike". The Anglo-Saxon genitive is firmly ingrained in colloquial usage, Martha's bag rather than the bag of Martha (unusual in native spoken English), as it maintains the expected word order with modifiers preceding nouns.
  6. English stresses the distinctions between definite and indefinite objects as well as between known and unknown quantities. Thus "the women" refers to a specific subset of womankind identified earlier in the discourse, while "women" without a determiner refers to adult females in general. Likewise "the wine", "some wine" and plain "wine" convey different degrees of specificity: "I like wine" means "I like wine in general", "I like the wine" may mean "I like the wine you just poured into my glass", and "I'd like some wine" simply means "I would appreciate an unspecified modest quantity of wine". The last form does not carry the same emphasis as literal equivalents in other languages, although it is broadly comparable to the French du vin or Italian del vino. In other European languages these distinctions tend not to be so important: French, Dutch, German, Italian and Spanish all have a much stronger colloquial tendency to use just the definite article, so "les femmes" may be both specific and generic.

The above aberrations from the continental European norm would suggest the enduring influence of substratum languages on the evolution of spoken English.

Neurolinguistic Programming

Psychologists have long been aware that the words and phrases we choose to express ideas can affect someone's willingness to believe a message, follow an order or internalise a new concept. Large organisations invest billions in the art of gentle persuasion, not just in advertising, but in public relations, awareness raising and increasingly in management via neurolinguistic programming (NLP). If your boss calls you to her office for a wee chat, you may reasonably wonder what she wants from you. Is she about to sack you, ask you to work overtime or offer you a pay rise to stop you leaving? Rest assured that most modern human resources managers have learned not only how to impart unwelcome news, but also how to deal with awkward employees who do not take kindly to management bullshit. An HR manager may engage in polite conversation about your children's progress at school and your last summer holidays, then thank you for your hard work over the last year, but none of this matters if the whole purpose of the meeting is to inform you of the termination of your employment. If the HR manager had just said, "Hello, Mr Jones, you're fired", the gist of the conversation would be the same.

Neurolinguistic programming is about much more than marketing. It serves to reframe common events and concepts in a way that suits the interests of the managerial classes; in short, persuasion. When you hear middle managers and politicians claim they did not get the message across to their target audience of employees, consumers or voters, it's an implicit admission that their NLP techniques failed, not that their policies are wrong.

Although the mismatch between English spelling and pronunciation presents a challenge to many learners, the language has proven very versatile in adopting new ways to express common concepts without having to alter its core grammar or syntax. Yet when the transgender lobby wanted to instil in the public mind the idea that gender may be non-binary, they devised a new set of pronouns as alternatives to he/him/his or she/her/hers. One may now be known as zhe/zher/zhers or they/them/theirs. Spoken English has long used the third person plural when the sex of an abstract person is unknown, especially when combined with someone, e.g. "Someone has left their phone on the table", but a sentence like "Kim gave me their key" would be ambiguous in English. Indeed Canada has enforced the use of confusing gender pronouns via its controversial Bill C-16. Professor Jordan Peterson correctly defined this as imposed speech, going against the Anglo-Saxon tradition of a free market of new terms which are voluntarily adopted, albeit, I may add, with a little help from the advertising industry. Should a central committee decide which pronouns we use to describe other people in our social group any more than it should adjudicate on the correct term for a tablet computer (something most people call an iPad)?

Many organisations now employ copy-editors whose remit extends way beyond correcting typos or amending grammar, syntax and style: they shape the core meaning and narrative that language conveys and expunge all politically incorrect references. For instance, the British Foreign Office urged the UN Human Rights Committee to change the term pregnant woman to pregnant person. Until recently, nobody would have been offended by the assumption that only women can become pregnant, a concept deeply entrenched in most naturally evolving human languages. In the early 21st century not only do some biological females identify as men or non-binary, but artificial wombs may one day enable biological males to experience motherhood. However, simply swapping old gender-specific terms for new neutral terms can produce some very hackneyed phraseology. New concepts, such as the normalisation of fertility treatment as a common means of procreation, have to be promoted via special-interest news stories and celebrity endorsements before the public at large can readily accept them. As Diane Ravitch detailed in her 2003 book, The Language Police, textbooks are being rewritten to reflect the postmodern obsession with political correctness that effectively closes our minds to a wide field of legitimate scientific inquiry.

One thing is for sure: while the diversity of spoken tongues is shrinking, language is evolving at an unprecedented rate, both to reflect our rapidly changing human ecosystem and to alter our perception of reality.

Categories
All in the Mind Power Dynamics

Anatomy of a Rebel

We may like to think of people as progressive or conservative, collectivist or individualist, egalitarian or meritocratic, caring or competitive, libertarian or authoritarian, selfless or selfish, nature-loving or materialist. All too often we simplify these issues along an arbitrary left-to-right spectrum, usually with the more virtuous stances on the left. However, one criterion sets us apart from the crowd: dissent. What kind of people will go against the flow and challenge contemporary orthodoxy out of personal conviction, risking social opprobrium?

In the 19th century the prevailing doctrinal system across much of Western Europe preached love of God, monarch and country, moral superiority of European civilisations, traditional two parent families and a rigid class system in which everyone knew their place. Critical thinkers would naturally look to alternatives that challenged the hegemony of the old aristocracy, the clergy and the emerging capitalist classes in pursuit of greater freedom, independence, morality or social justice. In short people rebel because they are dissatisfied with the current system and envision a better world for themselves, their loved ones or for wider society, which they see threatened by vested interests. Likewise, people conform to gain favour with the managerial classes and win the trust of their neighbours and colleagues around shared allegiances.

A rebel in 1960s North America may have opposed the worst excesses of capitalism with its unbridled cut-throat competition and its promotion of wasteful mass consumerism. By contrast a rebel in the Soviet Union of the same era would have opposed the state repression of personal liberties, censorship, pervasive surveillance and the extreme concentration of power in the party machine. While we may place one rebel on the left and the other on the right, they may well have been striving for the same fundamental human values that seek to marry personal freedom with social responsibility. Soviet-era propaganda would routinely portray dissidents like Aleksandr Solzhenitsyn as neo-fascists or dangerous reactionaries eager to unravel the great progressive gains of the workers' revolution. By the same token, Western rebels before the precipitous fall of the Soviet Union were often portrayed as communists who would threaten our cherished family life, Christian values, democracy or free market economy.

Anarcho-communists and Metro-elitists against the Traditional Working Classes

In country after country we're witnessing a rather odd spectacle. Social conservatives with a strong belief in two-parent families, nation states and cultural traditions have become the new nonconformists, rejecting the prevailing mantra of endless progressive social engineering. If you support gay marriage, open borders, cultural homogenisation and the re-categorisation of humanity into competing victim groups, you will enjoy the whole-hearted support of much of the mainstream media, academia, many well-funded NGOs, big business and many large governments. For decades the growing leisure sector has openly promoted a carefree lifestyle of boundless explorative indulgence. Two recent referenda in Ireland reveal the emerging divide between universalist conformists, easily swayed by celebrity opinion leaders and subliminal media conditioning, on the one hand and traditionalist nonconformists on the other. In both the gay marriage and abortion consultations, organisations such as Amnesty International and the Open Society Foundation joined the main political parties, the Irish Times and many high-profile celebrities such as Bono and former President Mary McAleese to push for change. Only ten to fifteen years earlier the Irish people would have rejected both referendum propositions.

On the other side of the big pond the intellectual gulf lies between conservative rednecks and liberal professionals with their large fan base of special interest groups, dependent either on welfare largesse or on the fruits of the postmodern lifestyle revolution. North American terminology often confuses outsiders. While American liberals may once have advocated less state interference in people's personal lives and championed small businesses and free speech, today they invariably advocate greater state involvement in every aspect of our lives, presumably to tackle the scourges of social isolation, discrimination and mental ill-health and to manage the complexities of rapid cultural change and apparent hyperdiversity, empowering state and corporate actors to monitor the masses for their own good. Indian-American author Dinesh D'Souza has coined a metaphor for the transformation of the American Democratic Party from supporter of slavery, racial segregation and the infamous Ku Klux Klan to champion of all purportedly disadvantaged victim groups. Whereas 19th century Democrat politicians wanted to confine African Americans to the rural plantations, dependent on the benevolence of their slave masters, the party now relies increasingly on votes from denizens of the urban welfare plantations, dependent on state handouts. 150 years ago farmers and manufacturers still needed plenty of cheap manual labour. Today they need loyal consumers more than conscientious workers.

We have progressed from an age when the authorities treated homosexuality as a mental disorder, often prescribing hormone treatment to suppress undesirable erotic urges, to an age when teachers, social workers and medical professionals collude to indulge transgender fantasies in young children, often prescribing hormones to suppress natural puberty. Whereas once sexual deviants may have fallen foul of the law, today parents and carers who adhere to traditional family values may attract the ire of busybody social workers and even have their children removed.

Meanwhile in old Blighty we see the Guardian-reading professional classes take to the streets to express their support for the European superstate and their distaste for the maverick US President, who seems too keen on enforcing border controls and not keen enough on military adventurism. Europe is inconceivable without France, but just 15 months after reluctantly opting for establishment wonder boy Emmanuel Macron in a run-off with the much-maligned nationalist candidate Marine Le Pen, the French have had enough of further global convergence. The yellow vests, or les gilets jaunes, represent the grievances of the squeezed provincial working classes and small business owners, those most affected by higher fuel duties, extreme labour mobility, outsourcing and smart automation. Recent socio-economic trends have had two main sets of perceived beneficiaries: the affluent professional classes and a growing array of welfare-dependent victim groups, who have acquired a sense of entitlement denied to previous generations, who before the expansion of the modern welfare state had either to earn their keep or to appeal to the generosity of their extended family. Combined, these groups still form a minority of the general population. While artificial intelligence may see the professional classes (currently around 15-20%) shrink further, the welfare classes are growing across Western and Northern Europe (anywhere between 15 and 25%). The squeezed middle of ordinary hard-working families, struggling to make ends meet, has become a little inconvenient for social policy planners as it tends to hold conservative views on most contemporary controversies, i.e. wanting to conserve the viable society that helped millions of ordinary people earn enough to marry, start a family, afford a house and buy a car to entertain the illusion of personal independence. Most citizens were happy for the state to offer a helping hand when they fell on bad times, but did not want the state to run their lives, raise their children or eavesdrop on their private conversations. The public sector should serve the interests of the people and not vice versa.

However, today sociologists, and many politicians, talk increasingly of communities rather than of the people, as the fast pace of demographic change, migratory flows and labour market fluidity has destabilised traditional rooted communities and replaced them with transient communities of disparate special interest groups, which may be as diverse as single mothers, gays, lesbians, Muslims, West Africans, Chinese, sufferers of mental illnesses, online gamers or Python programmers. We now identify people more by their behaviour than by their family or ethnicity.

The cosmopolitan professional elites and rooted masses have two conflicting worldviews. The former views grievances and civil unrest as social policy challenges that require more proactive intervention and outreach groups to engineer a more harmonious social reality by reconciling the divergent interests of our new intersectional communities. They see themselves helping other people adapt to globalisation and rapid cultural change rather than trying to preserve their former way of life. In short, the progressive managerial classes view the rest of us as overgrown children who must learn to play together without fighting or bullying.

By contrast advocates of nation states, still the vast majority of Europeans, view citizens as the architects of their common social landscape who agree on shared values and participate actively in their geographic community, i.e. a country is what its people make of it. Naturally some communities may have radically divergent cultural practices that impair social cohesion. To resolve such conflicts, we may either confine some activities to private properties or designated public zones, or seek greater regional autonomy to manage affairs more in tune with the wishes of local residents. Europe's largest nation states evolved after a lengthy process of cultural convergence largely along linguistic and religious lines. Multi-ethnic empires such as the defunct Austro-Hungarian Empire or the Polish-Lithuanian Commonwealth could survive in feudal times as convenient alliances of fiefdoms and royal dynasties, but few countries could nurture liberal democratic institutions without a strong sense of shared identity and usually a common language. Belgium and Switzerland have finely tuned federal systems to accommodate multiple national languages, while Spain has granted Catalonia considerable autonomy over language policy. Large multilingual federations as diverse as India, Nigeria or South Africa struggle to build a unified identity around an administrative language only spoken proficiently by the managerial classes. To this day the native peoples of Europe retain a strong sense of shared national identity and history, supplemented only by new universal behavioural identities and postmodern universalist values, but such parochial feelings are much weaker among the professional classes and young adults immersed in a world of pop culture and easy travel. As natives are now distinct minorities in many Western European towns and cities (e.g. in only 3 of London's 32 boroughs are a majority of primary school children classed as White British), we can only expect further weakening of shared nationhood.

However, we live in an era of shifting alliances. In France the latter-day Trotskyists of Jean-Luc Mélenchon's La France Insoumise make common cause with socially conservative lorry drivers, small business owners and farmers. Some on the left still remember the days when we supported workers' struggles against outsourcing and imported agency workers. Some old school trade unionists still realise the workers' struggle needs a united working class able to disempower their bosses through targeted industrial action. Globalisation has severely weakened the bargaining power of European workers. If they strike, their manufacturing facilities will simply be relocated, automated or operated by a new team of temporary labourers. The descendants of the old syndicalist left have failed to reconcile their universalist ideals of international solidarity and equality for all disadvantaged groups with the practical needs of today's core working classes, who struggle to compete in a dynamic labour market with an endless supply of transient human resources at the bottom end of the salary scale and ever higher levels of expertise required at the top end. What's worse, lucrative careers demand extreme specialisation and extraordinary personal qualities. Conscientiousness, or as we often call it today a can-do attitude, no longer suffices, leaving many redundant workers with a bleak choice between competing at the bottom end of the labour market for breadcrumbs and learning new intellectually challenging skills to outwit the best and brightest university graduates. Not surprisingly, many just give up and join the welfare classes. In 1980s Britain unsuccessful young adults would often blame Thatcherism for their misfortunes, but the old manufacturing and mining jobs that employed millions of workers are not coming back as robots take over. Today's Labour Party would like us to blame the Tories for cutting public spending. Yet such cuts are an illusion as government spending continues to rise year on year. The paternalist left would have us believe that minor adjustments to welfare provision, namely, in a British context, the roll-out of the new universal credit system, are fuelling the growth of foodbanks and homelessness in a country whose primary causes of premature death are all related to obesity and/or junk food and whose housing crisis is only exacerbated by unbalanced migratory flows, which they dare not criticise.

Luxury Communism

To fully understand the transition of the mainstream left from rebels to establishment cheerleaders, one need look no further than Aaron Bastani's new book about our emerging technotopia bankrolled by the world's leading tech giants. Would it be too far-fetched if a future worldwide government took the likes of Amazon, Huawei, Samsung, Microsoft, Google and Apple into public ownership and proceeded to redistribute their massive profits as a universal basic income? As Chinese industries begin to invest billions of yuan in intelligent robotics, the sleeping giant is poised to become the world's largest consumer market, with the government rolling out a social credit system to reward its citizens not for their hard work, but for their compliance. Currently, social credits entitle well-behaved citizens to discounts, easier Just Spend loans and travel passes, but it doesn't take a huge leap of faith to imagine that one day such a system could form the basis of a universal basic income. Your basic income would be supplemented by rewards if you acted as a model citizen proselytising preferred lifestyle choices and cultural outlooks. While it may seem fair to reward you for taking good care of your health through regular exercise and a wise diet to minimise your burden on the public healthcare service, you may not be so pleased about the state's undue interest in your mental health, whose definition now extends to your political and moral views. Only last week Humberside police questioned a man who retweeted a transgender limerick, which they flagged as a hate incident, after a series of social media messages critical of gender theory. Now imagine having your UBI cut because you failed to attend a gay pride event, expressed your disagreement with euthanasia (already legal in the Netherlands, Belgium and Switzerland) or just failed to cooperate fully with the government's social engineering initiatives. Bastani may envisage our idyllic future as a large holiday resort interspersed with parks, playgrounds, sports centres, dance halls, libraries, cafés and canteens where highly educated professionals only work a few hours a week. The trouble is that only a tiny fraction of the general population will understand the complex technologies that make such a world possible or be fully aware of the advanced people management techniques required to maintain the illusion of social tranquillity.

Agitating against Wrongthink

Back in the day fascists were autocrats who did not trust the people at large to participate fully in open debate about how to run their society. From a fascist perspective, benevolent dictators may occasionally consult the people via stage-managed plebiscites, but only the upper echelons of the managerial classes can be trusted with the administration of our collective infrastructure and organs of indoctrination and supervision. In this regard, Mussolini's Italy, Franco's Spain or Salazar's Portugal had much in common with Stalin's Soviet Union, except the latter aspired to worldwide socialism while often appealing to pan-Slavism and Russian nationalism. Not only did Mussolini start his political career as a socialist and as editor of the Italian Socialist Party's newspaper Avanti, but his fascist government pioneered the role of state intervention to accelerate industrial growth in a kind of public-private partnership known at the time as corporativismo.

Today many on the universalist left accuse anyone opposed to corporate globalisation of, wait for it, fascism. That's right. Fascists used to be corporate authoritarians, while today it's the opponents of corporate hegemony and cultural convergence who get labelled perversely as fascist. Even more perversely, free speech is now tarnished by its association with the so-called far right. In practice that means we can no longer freely discuss multifaceted issues such as migration, surveillance, sex education in primary schools or censorship without being accused of racism, terrorism, homophobia or hatred. Just as we asserted the right to intellectual freedom in the 1960s and 70s in the name of social progress, many left-leaning social justice warriors now spend much of their time campaigning to censor socially conservative viewpoints. They have become the new arbiters of politically incorrect thought, every bit as bad as Mussolini's Ministry of Popular Culture (Ministero della Cultura Popolare) or the infamous East German Stasi (Ministerium für Staatssicherheit). They are not rebels, but enforcers.

True rebels challenge the powers that be and without the freedom to criticise orthodoxy we will slide ineluctably into authoritarianism, albeit of a high-tech variety.

Categories
All in the Mind Power Dynamics

Guilt by Association

Screaming at a computer

Recently British author Douglas Murray took part in a video chat with the renowned YouTube sensation and self-proclaimed libertarian philosopher Stefan Molyneux. While critical of Radical Islam and mass migration, Douglas Murray has been careful to steer a middle ground. Initially he came across to me as a Blairite, not least because he's associate director of the Henry Jackson Society, a neoconservative think tank that has counted Labour MPs such as Ben Bradshaw and Jim Murphy in its ranks. I wonder how they reconcile their differences over the EU. As a supporter of the 2003 Iraq War, Douglas Murray has earned himself plenty of airtime on the BBC. I suspect his recent congenial conversation with Stefan Molyneux will soon catapult Mr Murray to the outer reaches of the cybersphere, seldom to be seen or heard again on mainstream TV or other approved channels of official news.

Stefan Molyneux openly believes that not only do genetic variations between racial groups affect our intelligence, but such differences are significant and irreconcilable. I don't have time to do justice to this debate because the subject both fascinates and disturbs me as I'm a little uncomfortable with some of the conclusions of leading social biologists like Charles Murray or Nobel Laureate and co-discoverer of DNA James Watson. Not surprisingly, Mr Molyneux has attracted a large following from what we might fairly call racists and I pick my words carefully. In my mind a racist is not someone who is simply proud of their racial lineage or prefers to mingle with others of the same ethnic background, for most Africans, Chinese and Indians would fall into that category. A true racist believes that their perceived intellectual superiority grants them special rights over other racial groups. Apartheid South Africa before 1994 and Zimbabwe as Rhodesia under Ian Smith before 1980 are classic examples of openly racist states that survived into the late 20th century. Their rulers often cited IQ test results to justify excluding most black citizens from the levers of power. Perversely the self-proclaimed liberal intelligentsia now accuses the native working classes of ignorance whenever they fail to endorse their preferred policy options. Ever so subtly both the BBC and Guardian have blamed a lack of education for the unexpected outcomes of the last US Presidential Election or last year's EU referendum. We read terms such as low-information voters, which is a codeword for low-IQ voters unable to interpret conflicting sources of information.

On Wednesday I awoke to the news that Donald Trump had retweeted three videos originally sent by Jayda Fransen of Britain First, whose main focus is on the rapid Islamisation of parts of urban Britain and the suppression of white British identity. Paradoxically the group's leaders often cite Winston Churchill's forthright warmongering against Nazi Germany in the mid-1930s to justify their stance against Islamic supremacism. To be honest, despite living in London for many years with a short period in Leeds, I've only ever seen Britain First online. Jayda Fransen only came to my attention in a series of videos about the town of my high school years, Luton. As far as I know Britain First is a splinter group from the much larger English Defence League, which attracted many football supporters of different backgrounds. Today in Luton the real divide is over allegiance to Islam, not race or ethnic background. The EDL, UKIP and now its splinter organisation For Britain are all very supportive of Israel and have many members of African, Asian or mixed racial heritage. The Glaswegian anti-Islam activist Shazia Hobbs, of half Pakistani descent, comes to mind. More notably, the Breitbart columnist, former UKIP leadership candidate and author of No-Go Zones is one Raheem Kassam, who grew up as a Muslim. The main thread that unites these disparate groups is their aversion to Islamic expansionism and their uncritical support for the Jewish State of Israel. Many naive leftists, and I include my younger self in this category, believe in a simplistic black and white world of affluent white imperialists and poor oppressed dark-skinned people. It seemed to make sense in the 19th century, when most wealth was concentrated in Europe and North America and Western governments treated their colonial subjects as second class citizens. Now the same multinationals that once supported British, American, French, Dutch or German imperialism have shifted their support to globalisation. They are not interested in spreading the cultural heritage of the countries that nurtured technological innovation or in granting their working classes any special privileges. They only need an elite of engineers, scientists and managers trained in psychology and neurolinguistic programming to keep their industrial operations afloat. Everyone else is expendable, useful mainly as consumers who earn crumbs from menial jobs that can be automated or from their obedience to an interlocking network of welfare providers.

However, the marginalisation of the native white working class has not succeeded in silencing dissent, merely in disrupting rational debate about how we should deal with an unprecedented rate of cultural change or even whether such changes are desirable or a price worth paying for short-term economic growth. So Brendan O'Neill hit the nail on the head in his recent blog post on the Britain First retweeting scandal. This fringe organisation is indeed a monster of the establishment's own making. Our soi-disant liberal opinion-leaders have demonised a large cross section of ordinary decent British citizens, who through no fault of their own happen to descend from a long line of Northwest Europeans who settled in these isles, for the crime of wishing to protect what's left of their cultural heritage in a world of permanent uncertainty. I think a narrow focus on Islam is misplaced, or rather its growth and the ensuing culture clash are symptoms rather than causes of a greater malaise. Would radical Islam pose such a threat if our rulers had not destabilised the Middle East and had not allowed the creation of parallel communities in towns and cities which until recently were boringly monocultural, with only a trickle of immigrants who had little choice but to assimilate? Can we not at least discuss the causes of the expansion of radical Islam? Is it fair to expect working class Europeans to accommodate more Muslim migrants because of the latter group's much higher birth rate? Ironically the very mention of demographics and environmental sustainability annoys both the Christian Right and Islamic fundamentalists, for both believe in large families despite a dramatic drop in infant mortality. The population of Africa and the Middle East is not rising so fast because women are having more babies, but because more babies survive thanks to modern medicine and better sanitation. Subsistence farming can no longer sustain such large populations, leading to a massive oversupply of migrant labourers and beggars in the burgeoning metropolises of the developing world. In such an environment it is easy to see the appeal of radical universalist ideologies that promise welfare for all in exchange for doctrinal subservience. This explains the seemingly odd alliance between the Marxist Left and Radical Islam. Both ultimately lead to an extreme concentration of power in state or corporate institutions. What is the point of feminism or the empowerment of women, if all but the most privileged citizens, of either gender, have to submit to the will of higher authorities governing every aspect of human behaviour? I would hope we could have an honest debate on this subject without resorting to unnecessary shaming through guilt by association or pointless virtue-signalling.

https://www.youtube.com/embed/eAsDYc6vR5A

Douglas Murray discusses the concept of guilt by association with American philosopher and neuroscientist Sam Harris.

Categories
Power Dynamics

NewSpeak for the Masses

Pidgin English

Over the last 10 years I have slowly but surely convinced myself that the pace of cultural change is accelerating at an unprecedented rate. The only certainty now seems to be uncertainty itself. Scenarios we deemed unthinkable just five years ago now look like emerging realities. We're all in a way prisoners of our age and may easily lose sight of the wider historical context. As civilisations wax and wane, we'd be foolish to assume our current way of life will continue to progress along the same lines.

Out of the blue I learn the venerable BBC is spending a fair chunk of taxpayers' money to spread its message not in standard English as understood in Britain, the USA, Canada and many other countries, but in West African Pidgin. Apparently Naija or Nigerian Pidgin is now the primary street lingo of over 70 million people in a country whose population is projected to grow from around 190 million today to over 400 million by 2050, unless the fertility rate declines significantly. The BBC has moved its focus away from Europe, though it struggles to hide its clear bias in favour of the EU project. It has admittedly, and perhaps laudably, also expanded its coverage to Hausa, Yoruba and Igbo, Nigeria's three most widely spoken native languages, but I guess its main focus is on the mushrooming urban areas that will grow into Africa's megacities of the mid 21st century. Here Broken reigns supreme. This adjectival noun is the slang term for variants of international English that diverge syntactically, phonologically and semantically from standard Nigerian English, the lingua franca of the country's affluent urban elite.

On the surface one may wonder why the BBC is bothering at all, as most educated Nigerians with easy Internet access would much prefer standard English, the main language of their country's press, schools and commerce. However, I suspect our intrepid global media organisation may have a longer-term view. Have the BBC spotted a trend or are they helping to set one? Have they merely acknowledged that standard English is giving way to Pidgin in West Africa, or are they elevating a street vernacular to the status of an urban lingua franca that could serve as a blueprint for a future global sociolect, merging with other English-based urban dialects around the world in an era of mass migration? A fluid and rapidly evolving street lingo could also be a good candidate for a future NewSpeak, stripped of the semantic range and subtleties of academic English or other well-established literary languages. Another factor, which I suspect many have overlooked, is the social acceptance or coolness factor of a dialect. Formal variants of languages are seldom cool. They may be the route to academic success and career progression, but they can isolate their elitist speakers from the plebeians who have infused their vernacular koiné with demotic phraseology and pronunciation. Likewise the urban masses view native tongues as uncool remnants of their rural past. We may compare global academic English with the posh Received Pronunciation that the BBC used to impose on its reporters. What West African Pidgin may lack in official recognition, it certainly gains in street cred.

Many of us take language for granted. We tend to assume that the current linguistic status quo will only gradually evolve from generation to generation. Back in the late 80s and early 90s, as a language teacher and later as a technical translator in Italy, I witnessed and took part in many debates on the growing role of the English language in continental Europe. I sympathised for a while with Robert Phillipson's concept of Linguistic Imperialism and felt perhaps native English speakers enjoyed unfair advantages, but the prime movers behind the apparent Anglicisation of Europe were not native English speakers or their governments, but multinational companies and the emerging globally connected professional classes. In principle Esperanto would have been a much fairer choice for international communication, as it puts everyone on a level playing field and has an easy pronunciation with phonetic spelling. Perhaps in a parallel universe where English did not spread far beyond its home island and no other major language filled the void, we might have opted for a neutral artificial lingua franca, but it may well be human nature to gravitate towards the most prestigious and influential modes of communication. While the expansion of English as a universal lingua franca seemed unstoppable, the other main languages of Europe, the Americas and Asia did not disappear. Indeed while French and German may be in relative decline outside their home regions, Spanish and Portuguese have retained their dominance in Latin America. But since the 1990s three big trends have transformed our linguistic landscape.

  1. First the Internet has enabled seamless communication with anywhere else in the world. Geography is no longer a barrier. That helps non-native learners of English to immerse themselves in English-medium interactions without leaving their office or home, but also lets new migrant communities isolate themselves from their adopted countries.

  2. Second the world is literally on the move as never before. People either migrate away from close-knit rural communities where traditional languages prevail to bustling multicultural metropolises or to wealthier countries with better employment opportunities and more advanced welfare provision.

  3. Third the English language itself is evolving at an unprecedented rate even among its native speakers, mainly due to rapid technological and cultural transformation, but also as a result of strong ideological pressures to create new concepts and suppress older categories.

As the ultimate vehicle of cultural expression, language serves four main functions:

  • Communication
  • Abstraction of complex concepts
  • Social bonding
  • Cultural identity

We may think of communication as the mere exchange of facts and instructions, but more often than not it relies on a learned cultural context and a willingness to share strategic intelligence and inner thoughts. Some psychologists, such as Albert Mehrabian in his 1981 book Silent Messages, have suggested most casual communication is non-verbal, i.e. implied via body language, tone of voice, choice of terminology and secondary semantic associations of key words and phrases. Suppose you want to order a drink. The customer and bartender only really need to exchange three pieces of information, an unambiguous description of the desired beverage, its availability and price. Indeed the whole exchange can proceed without a common spoken language. A price list with pictures and international brand names can suffice. However, commercial communication is about much more than just enquiring about the availability and price of an item. It's about establishing a rapport with customers, by learning their predilections and building a network of regular visitors, who not only enjoy your food and drinks, but also your hospitality, your wit and your venue's ambiance. That often means not only sharing a common spoken language, but also cultural attitudes. While the practical aims of knowledge sharing and the expression of complex ideas tend to favour languages with far-reaching prestige and solid literary traditions, the demands of social bonding and cultural identity often lead to fragmentation as speakers adapt to speech patterns transmitted as substrata of earlier indigenous languages. People tend to associate affected standardised speech, as learned at school, with the privileged classes. In Africa's teeming conurbations the linguistic contrast is not so much between colonial European languages and native tongues, but between rival sociolects of what may deceptively seem the same lingua franca.

The continued evolution of global English raises two key questions in enquiring minds. The first is how long what we may loosely call Anglo-American English will remain the standard bearer of global communication. That means maintaining full mutual intelligibility with the language commonly spoken by educated North Americans, Britons and Australians, with all its idiosyncrasies. Many observers have viewed the last century of linguistic convergence as a gradual shift towards the universal adoption of Mid-Atlantic English, relegating the esteemed national languages of Europe and Asia to the status of local dialects, a bit like Welsh today. Some argue that as long as these idioms remain the native tongues of most local citizens, we need not fear their extinction as actively spoken community languages. However, as migratory pressures grow, that's already beginning to change in many European cities where natives have become an ethnic minority.

The second big linguistic issue of our age is the divergence of linguistic registers for radically different social classes. Most academic and scientific publications, usually rich in jargon and poor in prose, are incomprehensible to most non-technical readers. Phonology and morphology may be of great interest to linguists and language learners alike, but what really matters is the range of concepts a language can express. Many conceptual nuances are deeply engrained in our collective psyche transmitted as memes from one generation to the next. By abandoning the languages of our ancestors, we lose touch with our past and adopt a volatile vehicle of communication we cannot truly own as we struggle to adapt to its rapidly changing vocabulary and semantics.

Languages can evolve both to expand and to restrict the range of permissible thoughts. Over the last century English has shown a strong tendency towards simplification and abbreviation, often at the cost of clarity and exactitude. However, academic writers can always resort to more conservative Greco-Latin terms to avoid the semantic connotations of more common words. To complicate matters, many previously innocent descriptors have fallen out of favour with the language police as they may reveal unfashionable prejudices and cultural assumptions. We've gone a long way from discussing whether African Americans mind being labelled as black or niggers. Now we debate whether we should rename Father's Day special person's day to avoid offending fatherless children. North American students have recently started advertising their preferred pronouns to avoid offending the tiny minority of transsexuals who do not identify as either female or male. Yet cultural revolutionaries are not content with redefining traditional racial, ethnic and sexual categories. They've turned their focus to the realm of psychoanalysis and psychiatry. A person can now be defined by an endless array of arbitrary behavioural patterns. While schoolchildren may once have struggled with their parents' cultural identities, such as Protestant or Catholic, now they may identify as sufferers of ADHD, social anxiety, depression or gender dysphoria, to name but a few new-fangled labels that modern teachers use to describe their students. Psychobabble began to make inroads into everyday English back in the 1980s, but has now transformed the way we relate to neighbours and colleagues.

Our linguistic future depends not only on the rise and fall of regional superpowers or the development of real-time translation applications, but also on our evolution as a species. Will we all join the professional classes and play equal roles in shaping our shared future? Such a utopian vision seems improbable if we contemplate the growing intellectual rift between today's affluent professional classes and the much more numerous underclasses whose monotonous jobs are being automated. If we add CRISPR gene editing technology and cognitive enhancement via microchip implants to the mix, humanity may well diverge into subspecies with a much wider range of intellectual performance than today. Would these disparate groups need to speak the same language and, more importantly, would the intellectual elites want the underclasses to understand their deliberations at all?

Pidgin as a Future Global Sociolect

Anti-imperialists have long had an ambivalent attitude to the languages of their former colonial masters. Different countries have taken different approaches. Some have successfully adopted an indigenous lingua franca where it has gained considerable prestige or is deeply embedded in local culture through literature and religion. Tanzania may have a hundred or more local languages, but Swahili, taught in all primary schools, remains the main link language between communities while English is reserved for secondary and higher education. By contrast English has long been the primary medium of instruction in post-independence Nigerian schools. However, students learn a distinctively Nigerian version of the language as teachers alternate between standard English, Pidgin and snippets from native African tongues. To the untrained ear it can be hard to discern where one language begins and another ends.

Pedagogues have long debated whether children learn best through the medium of their native language if they will later have to learn another more prestigious language anyway to stand any chance in the job market or even be able to communicate at all if they move to the nearest city. Many studies have shown that where a native tongue still prevails in the local community students can progress faster in many core subjects through a standardised version of the language they speak at home because they can grasp new concepts better through semantic associations they have internalised passively through everyday social interactions and play outside school. Alas in practice teachers may not speak the same dialect as their pupils, who may not all share the same mother tongue. Native language teaching works in stable communities, but not in dynamic urban settings. As a result the newly enforced lingua franca is effectively dumbed down to a level that facilitates communication between students and teachers. Modern teaching methods in mainstream state schools emphasise inclusiveness and self-esteem rather than excellence and rigour. Not surprisingly the Nigerian business classes tend to send their children to private schools with smaller class sizes and better educated teachers. However, most West African children attend state schools with average class sizes of 50 to 60 and poorly paid teachers who impart their own demotic variant of the lingua franca. Peer pressure clearly favours Pidgin, while standard English serves mainly as a written language outside classrooms.

Nigerian Pidgin is of special interest to sociologists and linguists alike because its dialects form a continuum from informal Standard Nigerian English through urban sociolects to local vernaculars more heavily influenced by native tongues. Some have suggested that Naija could evolve into a more accessible version of pan-African English, discarding native English's impure vowels and trickier consonantal clusters and further simplifying its grammar.

Some aspects of Naija resemble the kind of street slang you may hear in London's multicultural schools where most pupils learn English as a second language. One of the quirkier rules of English grammar is the use of the present perfect tense (e.g. I have lived in England for six years) for an event that started in the past but is still ongoing. Most other languages use the present tense in these cases, e.g. I live in Lagos for six years (where 20th century native English would require I've lived in Lagos for six years). Until recently such usage would have appeared unnatural to most native speakers of English, but it's creeping into London English and few teachers make any effort to correct such aberrations, being more concerned with spelling and classroom discipline. Cockney has all but died out. In its place new Londoners have adopted Multicultural London English, a melange of late 20th century Estuary English, Jamaican and Pakistani English.

If mass migration continues at the current rate, and it may well accelerate, the emerging lingua franca of the world's interconnected megacities may resemble Naija more than any variety of native English. Could we be witnessing the withering of traditional ethnic languages and emergence of global sociolects in a new extreme form of diglossia or polyglossia? In this scenario the language we now call English would split into three sociolects with only partial mutual intelligibility. The planet's professional elites would converge on a tame version of North American English rich in Greco-Latin terminology and new technical jargon, but devoid of vernacularisms peculiar to native variants of English. Its syntax and pronunciation would gradually evolve in a similar way to Middle Latin jettisoning the quirkier aspects of Anglo-Saxon grammar. Meanwhile native varieties of English would lose their status as linguistic standards to which others should aspire, while urban Pidgin English would rise in prominence gaining hundreds of millions of new native speakers as Africa's population grows. At some stage Pidgin, as understood in Lagos, London and Mumbai may be more useful to globetrotters than the quaint Anglo-Saxon dialects of Iowa, Yorkshire or New Zealand.

Categories
All in the Mind Power Dynamics

Hackneyed Jargon and Intellectual Honesty


What I really mean by globalists, elitists and corporatists

A couple of weeks ago someone chastised me on Twitter for using the term Deep State to explain Donald Trump's Damascene conversion over Syrian regime change. Apparently the concept that the US Administration may be beholden to secretive cliques with close ties to the military industrial complex is a mere conspiracy theory perpetuated by Russian propagandists. All sane analysts know the US State Department has only ever supported the causes of liberal democracy and human rights abroad, if we exclude occasional strategic alliances with our enemy's enemies who turned out to be worse than our enemy. So by this logic General Dwight D Eisenhower was a mere conspiracy theorist at the height of the Cold War. Indeed most of the evidence I've encountered about the Deep State comes from Americans such as former Reagan Advisor Dr Paul Craig Roberts and Pulitzer Prize-winning journalist Seymour Hersh. Of course, the wheelers and dealers behind the Deep State deny its existence. They're merely exercising a little pressure on whoever happens to be in the White House.

I find it very hard to write about current macro-political developments without using the misunderstood adjective globalist or its related abstract nouns. I know it's hackneyed and many will dismiss my musings as those of a mad isolationist who simply wants to stop the world and return to a harsh primordial habitat. I guess globalism is more of a philosophy, while globalisation is a phenomenon that results almost inevitably from rapid technological and economic changes. However, I cannot think of alternative terms that others would not misconstrue to an even greater extent. The real bone of contention here is not whether greater planetary interconnectedness is a good thing, but who is in control and for what purpose. Do we all need to adopt the same cultural paradigms and discard traditions that evolved gradually over hundreds of generations, or can we harness recent technological advances to preserve the best of humanity's diverse cultures while allowing different peoples to experiment with new cultural expressions? Do we want a multipolar world with a mosaic of intersecting but socially cohesive communities or do we want a homogenised unipolar world? In common usage globalist may refer to many things:

  • Corporate globalist: a supporter of the hegemony of transnational corporations over national or regional organisations.
  • Political globalist: a supporter of greater convergence of existing national and supranational governments. A political globalist may cite the phenomenon of corporate globalisation as a reason to transfer power from small nation states to larger regional blocs, arguing that only big organisations can counteract multinational businesses.
  • Global idealists: advocates of one-world love, free of all barriers that divide different groups of people. Such wishful thinkers imagine the whole world as a hippie commune and fail to see how breaking down one type of barrier, such as borders between countries, can lead to the erection of new barriers, such as electrified fences around private properties, when rapid cultural integration does not go as planned. Global idealists will often decry corporatists or mega-statists, especially when the global elites seek to transfer more power to greedy corporations and limit personal freedoms through greater surveillance.
  • Elitists: believers in a hierarchical society controlled by a small group of privileged individuals who consider themselves both morally and intellectually superior to the rest of humanity. Traditionally such people would favour nation states as the best means of preserving their power. However, today the globally connected rich prefer globalism to circumvent local democracies and expand their commercial empires. While a medieval elitist might want a principality to protect his castle, a postmodern elitist just buys an exclusive resort anywhere in the world as long as it's well protected and easily accessible via private helicopter or yacht.
  • Internationalists: advocates, by contrast, of a multipolar world that seeks to harmonise the practical needs of greater cooperation between communities and countries with people's desire for greater self-determination and gradual cultural evolution.

Few could doubt that the ability to communicate freely and instantly with anyone connected to the worldwide web is a good thing. It could help us learn from each other and resolve potential conflicts peacefully and amicably, as long as we accept that others may have a very different perspective. A true humanitarian does not seek to change other peoples, but to learn from them, sharing knowledge and experience without imposing a new way of life. Some practices may seem vile or immoral from our perspective. We may view the treatment of women and homosexuals in some majority Muslim countries with abhorrence. Many global idealists see it as their mission to liberate women and gays in these countries. Naive global idealism can easily yield to full support for military adventurism, especially when justified by human rights concerns. However, a devout Muslim could by the same logic justify intervention in Western Europe to thwart the perceived evils of abortion, sexual promiscuity, stupefaction, gambling and usury. If you've ever tried to debate contentious topics such as abortion, you'll know what I mean. Pro-lifers will condemn pro-choicers as mass murderers, while the latter will denounce the former as religious zealots and apologists for misogyny and child abuse. In a multipolar world the citizens of one country could agree to ban abortion (except in cases of rape or where the mother's life is in danger), while those in another country could allow it as the lesser of two evils. However, globalists would advocate a universal set of laws on such matters. If one can universally allow women's freedom of choice on abortion, one can also universally outlaw it, which may in practice lead many women to resort to shoddy backstreet clinics or dangerous abortion pills. The same logic applies to sexual mores. If we had a global referendum on the legal status of adult homosexuality or the death penalty, the outcome might shock Western liberals, and recent demographic trends will only increase the number of people in ethno-religious communities that not only denounce homosexuality but favour capital punishment. Mass migration, a phenomenon that globalists of all hues welcome, undermines traditional nation states, but creates new parallel communities with divergent cultural outlooks. To accommodate these communities, the authorities have to roll back the gains of the last three to four generations of social progress towards a more laid-back and tolerant society. Communal tolerance only works with high levels of mutual trust and shared values. Until recently mixed-gender social nudity was common in many locales in Scandinavia, Germany and the Netherlands. Now such venues have to be sectioned off to avoid conflicts with these countries' growing Muslim communities. The recently elected Austrian President, Alexander Van der Bellen, suggested that all women should wear veils in solidarity with those who have to cover their heads and faces for religious reasons. The trouble with universalism is that it all seems fine in theory if the world converges on the cultural expressions and practices that you favour. In the near future divergence from the universally enforced norm will be a privilege afforded only to the lucky few. Sir Richard Branson can carry on lecturing us on the wonders of globalism from his own private island. I wonder how many refugees from Middle East war zones Sir Richard has welcomed onto Necker Island.

Categories
Power Dynamics

Should we still call the global lingua franca English?

In more innocent times we associated a language with its national community. For much of history nations and languages had a symbiotic relationship. Language is the ultimate vehicle of the cultural traits that hold communities together and build trust in institutions. A multilingual country is effectively an empire, for it has to unite peoples unable to communicate easily except through the medium of a common higher-register language that is not their own. In a simplified multipolar world, each country would have its own language and a set of shared customs: in Denmark one speaks Danish and in France one speaks French, both languages intimately bound to their motherlands. Admittedly, French serves as a lingua franca in much of Northwestern and Central Africa and even Danish acts as a colonial language in Greenland. French is also spoken in Quebec, Walloon Belgium, Western Switzerland and a few French overseas territories dotted around the globe, but most native speakers live in metropolitan France. By contrast, only around 10–12% of native English speakers (L1 speakers in the Anglosphere and expat communities) live in England itself. The ethnic English proportion may be a little higher if we include the greater diaspora in Canada, Australia and South Africa who still identify as English, but most native English speakers are North American and many more live in Australasia, Southern Africa and elsewhere.

It's hard to measure just how many people speak English worldwide as a second language. The total could be as high as 3 billion if we include everyone who has learned some basic English at school or at work, or as low as 500 million if we restrict it to those who speak the language with a high degree of proficiency and, most importantly, retain full mutual intelligibility with native English speakers. Other estimates relate to varying degrees of fluency and may apply different criteria, e.g. the number of school leavers with a basic English language qualification, or a random sample of the general population in which participants have to engage in conversations of varying difficulty, ranging from basics such as asking for directions to more challenging topics such as politics. As a rule, English as a lingua franca is much more widely spoken in cosmopolitan cities and by members of the better educated professional classes. Whichever way we measure it, recent technological and cultural changes have vastly expanded our need to communicate with people from other language communities. Global English, for all its defects, not least its inconsistent pronunciation and orthography, has succeeded where Esperanto and a handful of other neutral artificial lingua francas failed. As the pace of globalisation and cultural change accelerates, the core of native and near-native English speakers will find themselves outnumbered by those who speak the language in wildly divergent and creative ways with little reference to the original variant of English that first migrated from the British Isles in the 17th century. Indeed it was not until the mid-19th century that English gained the upper hand over French, Spanish, Portuguese, Arabic, Russian or Chinese. Although France lost the Seven Years' War in 1763, having to cede Quebec and most of its Indian territories to Great Britain, and its hopes of European supremacy were dashed at the 1815 Battle of Waterloo, French remained the preferred language of diplomacy and of greatest prestige in Europe well into the early 20th century.

English means different things to different people. To the English, it may still be a symbol of ethnic identity if spoken in its insular form with its odd colloquialisms and regional pronunciations. Today you will seldom hear the clever melange of Anglo-Saxon and Norman French that characterised Shakespeare's works, but rather a mishmash of vernacular British English, Americanisms and branded neologisms interspersed with politically correct NewSpeak and catchphrases popularised by TV personalities. The Scottish and Irish tend to have a more pragmatic view of the language, but take pride in their local dialects. To Nigerians or Indians, English is the high register of their commercial lingua franca. The subtleties of regional English dialects or the latest suburban slang from Merseyside or Hampshire are of little interest to your average African or Asian business person, for whom English is a vehicle of communication and expression, but not a badge of tribal identity. To continental Europeans, English was, until recently, just another foreign language, but it has now become a gateway to participation in the globally integrated business world, academia and youth culture, especially of the kind that global entertainment businesses most heavily promote. At times it seems global English trumps native languages everywhere, even where they remain strong. Yet to view this as a triumph of English culture over the rest of the world is, in my humble judgement, to misunderstand the far-reaching consequences of rapid global cultural convergence. Indeed traditional British English may well be a victim of its own apparent success, submerged by a rapidly morphing global lingua franca that owes as much to Bangalore, Berlin and Beijing as it does to Birmingham, Brisbane and Boston. If a Briton from the 1950s could, through the magic of a time machine, experience the linguistic reality of modern Britain, she would be very confused. While superficially many common words would be much the same, many old terms and phrases have acquired new meanings or been superseded by more politically correct neologisms. Much discourse would be unintelligible without detailed knowledge of the last 50 years of technologically driven culture, replete with brand names, acronyms and adapted foreign recipes. Back in the 1950s most Britons did not even have a phone or a television set, let alone an iPhone.

Opinions on the role of global English vary. Robert Phillipson put forward the theory of linguistic imperialism, and his book of the same name is a must-read for anyone interested in cultural change. While I find many aspects of this perspective persuasive, especially in the context of cultural imperialism, in my experience abroad the key drivers behind linguistic homogenisation are not native English speakers at all, but international business. British imperialism and later US economic supremacy merely set the stage for English to expand way beyond its core of native speakers (still only 6.5% of the world's population). I find Jean-Paul Nerriere's concept of Globish, as popularised in his 2009 book of the same name, much closer to the emerging linguistic reality, although I do not share his optimism that American and British English will retain their privileged status, which will wane with their relative economic and cultural decline. While I found much of the historical research in Nicholas Ostler's The Last Lingua Franca of great interest, I cannot support his conclusion that automated simultaneous translation technology will supplant the need for global English and let everyone cultivate their own vernacular. I've no doubt natural language processing will sooner or later let us translate human speech into a machine-readable representation, but it will be some time before computers can interpret the full range of nuances of colloquial human speech. Like it or not, cultural convergence is the order of the day, so now the French have to learn Globish while the Brits have had to discard feet, pounds and pints in favour of metric units.

I taught English as a foreign language for three years and soon learned that English syntax had more exceptions than rules. As soon as I explained a rule, some wise spark would cite an exception, often from William Shakespeare, Jane Austen, Charles Dickens or whichever pre-20th century English authors happened to be on their reading list. However, the biggest stumbling blocks for my German and later Italian students were pronunciation, especially understanding authentic native speakers, and literal translations from their own language. In the pre-Internet era my best advice was to acquire English-medium movies with the original soundtrack subtitled in English. Most could read the language much better than they could speak it. If you attempt to read subtitles in your own language, you will miss the subtleties and flavour of the source tongue. At the time the received wisdom was that English is on the whole much easier than the other main European languages. The English-is-easy meme has become a self-reinforcing mantra, which in my experience as both a language learner and teacher is more attributable to its cultural ubiquity and prestige than to any intrinsic qualities. On the surface English grammar is very simple, with no confusing grammatical genders (e.g. the Sun is masculine in Italian and French but feminine in German), a limited range of verb conjugations (I do, he does, I did etc. as opposed to faccio, fai, fa, facciamo, fate, fanno, ho fatto, feci, facevo, farò etc.), only a few dozen common irregular verbs, very uniform plurals (with a few exceptions, of course), undeclined adjectives and just a bare-bones case system. One wonders how Czech children can cope with seven grammatical cases and three grammatical genders, but they do. Indeed even Old English had five cases and three genders, very similar to modern Icelandic or German. However, by this metric, the easiest language in the world must be Chinese, in which verbs, nouns and adjectives are never inflected and relationships between words are either implied by word order and context or emphasised with helper words. Native English syntax is not as simple as many continental learners of the language would like to believe. Word order plays a much more important role in English than it does in languages with a more clearly defined case system like German or Polish. English has special interrogative auxiliary verbs to maintain its default Subject-Verb-Object (SVO) word order (e.g. When did you live in Italy? but How many people live in Venice?) and a vast array of verbal tenses built with auxiliary words (such as I do, am doing, will do, am going to do, have done, have been doing, did, was doing, used to do, had done, had been doing etc.). While these tenses and aspects serve useful semantic functions for native users, their utility is lost on speakers of other languages. In Germany, the Low Countries, France and Northern Italy, past actions are typically expressed with a tense we confusingly call the present perfect, e.g. I have done (j'ai fait, ich habe gemacht, ho fatto etc.), while English uses the simple past for actions completed at a definite time (e.g. I ate an apple five minutes ago, but I've never eaten a horse). English also distinguishes continuous from simple verbal forms, e.g. I drink tea (i.e. I'm a tea drinker), but I'm drinking orange juice (at the moment). In many other languages, the same verbal form would be used in both cases.

While English syntax may be a tad quirky, the biggest challenge for most learners is pronunciation. I once suggested the best international language would be written more or less like English, but pronounced as if it were Italian or Spanish. Naturally, some sounds are easier for speakers of some languages than others. Castilian Spanish, Greek and Arabic have the dental fricatives /θ/ and /ð/, as in theft or then, sounds that often expose French, German and Italian speakers of English to ridicule. However, consonants only form the outer shell of syllables. Vowels and stress add colour to our speech and help us distinguish thousands of short words that would otherwise be homophones. Moreover, English vowels are notorious for their indistinctness. Most languages use variants of the five cardinal vowels /a/, /e/, /i/, /o/ and /u/, with a few diphthongs and possibly a few extra vowels. English, by contrast, has a complex system of short, long and gliding vowels that sit midway between cardinal vowels. The cardinal /a/ may be confused with the North American rendition of the short o in hot, the Southern English version of the short u or /ʌ/ in hut, the long /ɑː/ in heart (the r is usually suppressed in modern Southern English) or the Northern and Midlands English pronunciation of hat. In unstressed positions most vowels become either a schwa /ə/ or a short i /ɪ/, e.g. comfortable may be phonetically transcribed as /ˈkʌmfətəbəl/. Indeed if English adopted phonetically accurate spelling, many would mistake it for a quaint Scandinavian dialect with a few extra letters and diacritical symbols.

Speech patterns are learned in early childhood. Each dialect has a repertoire of sounds it must distinguish to facilitate communication. Our ears are fine-tuned to differentiate the phonemes particular to our linguistic environment. Exposure to other dialects enables us to remap these phonemes to other variants. As the pronunciation of English differs quite markedly from its spelling, native speakers will often associate different sounds with the same written form. Over time common terms tend to be shortened, while ambiguous short words may need a companion word to emphasise their meaning or may give way to less ambiguous alternatives; the Old English term wifman, for example, became woman. Nonetheless, some languages tend towards abbreviation much more than others. In Italy the term scontrino fiscale amused me: why would shopkeepers have to keep reminding me that the small paper receipt they had just given me was for tax purposes? Many linguistic communities prefer more complete and semantically correct terminology for cultural reasons. If we had retained the Victorian attitude to word formation, many common English-medium neologisms would be much longer. The first high-capacity horse-drawn coaches were commonly known as omnibuses, from the Latin for "for all", and only later shortened to bus. Terseness is not always an advantage: as I find in my day job as a programmer, longer descriptive names are easier to interpret than concise but ambiguous ones (a point illustrated in the brief sketch below). The term iPad is the trademarked creation of a marketing department. It owes its success to its extreme simplicity. Yet pad has many other meanings, anything from a soft wad of material, a booklet of writing paper as in notepad, a flat-topped structure such as a launchpad or heli(copter) pad, a flat area of a circuit board or a small city apartment. The correct term for a device like an iPad or a Kindle Fire, both ephemeral devices, is electronic tablet, but tablet alone has plenty of other meanings. Smartphone may be more neutral than iPhone, a trademark, but is itself a neologism that fails to adequately describe its true nature. Indeed the forerunner to modern smartphones was the personal digital assistant, or PDA, which is admittedly not quite as catchy. These new coinages rely heavily on their neurolinguistic impact. They must be short, relatively easy to pronounce and distinguishable from their technical predecessors. If you want to sell a new kind of coffee, a descriptive Anglo-Saxon concoction like concentrated coffee with frothy milk would be bad marketing; cappuccino sounds much better to your average English speaker.
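As a brief aside on that naming point, here is a minimal sketch in Python; the function and variable names are purely hypothetical, invented for illustration rather than taken from any real codebase. Both functions compute the same compound-growth figure, but only the descriptive version tells the reader what the numbers mean.

    # Terse but ambiguous: the reader must guess what d, r and t stand for.
    def calc(d, r, t):
        return d * (1 + r) ** t

    # Longer but descriptive: the intent is clear at a glance.
    def projected_balance(initial_deposit, annual_interest_rate, years):
        """Compound a deposit annually at a fixed interest rate."""
        return initial_deposit * (1 + annual_interest_rate) ** years

    # Both yield the same result; only the readability differs.
    assert calc(1000, 0.05, 10) == projected_balance(1000, 0.05, 10)

The terse version saves a few keystrokes; the descriptive one saves the next reader a trip through the surrounding code.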

Categories
All in the Mind

You’re just an Individual

Whether you read psychiatric literature and social work reports or listen to the speeches of leading politicians, you'll find ordinary citizens increasingly referred to not as women, men, people, human beings or citizens but as individuals. Whether you lead an atypical lifestyle, are considered to suffer from a disorder or disease, are addicted to an obsessive behaviour or harbour subversive opinions, someone somewhere will probably refer to you as an individual with some label or other. A quick Web search for "individual with" (in inverted commas to narrow the search to that exact string) returns a plethora of references to disabled or psychiatrically labelled subjects. But the application is gradually spreading to encompass a wider cross-section of misfits, miscreants and delinquents, even blurring essential distinctions between the groups. A misfit is someone who simply finds it hard to assimilate into mainstream society for whatever reason: a geek fascinated by outmoded programming languages and oblivious to norms of dress may be a misfit in a fashion-crazed culture. A miscreant fails to believe the official doctrine: someone who doubts the veracity of the latest terrorist scare is a miscreant, setting herself or himself against the dominant media outlets. A delinquent deliberately behaves in a socially irresponsible and potentially destructive way, or may be so engrossed in the pursuit of pleasure that she or he is simply unaware of the social consequences of her or his actions: an alcoholic gambler, for example, may soon become a delinquent forced into crime through mounting debt and a risky lifestyle. Yet to statisticians all these categories just comprise individual specimens of humanity in need of classification.

New Labour enforcers seem to have four responses to well-argued condemnations of the government's actions. They may define the opinion holder as an extremist aligned with authoritarian regimes or fundamentalist religious sects. They may write off the view as a mere conspiracy theory. They may call into question the challenger's appraisal of the facts, appealing to their residual party loyalty. However, when none of these options appears expedient, a classic tactic is simply to acknowledge awareness of the individual's personal beliefs. So if Tony Blair claims that the British foreign secretary's refusal to vote for an immediate ceasefire in Lebanon, following over 1,000 civilian deaths, is "the right thing to do to secure peace in the Middle East", we are supposed to believe he is privy to information that substantiates this claim. Yet if an opponent exposes the sheer hypocrisy of Blair's position in siding with the aggressors, their views are dismissed as the personal opinions of individuals. Likewise, if a woman becomes addicted to Internet gambling, her psychological dependence on this pastime and the resulting bankruptcy are considered the personal problems of an individual with an obsessive-compulsive disorder. Those responsible for deregulating and promoting the activity are just politicians and entrepreneurs responding to public demand.

Most words have their uses, but the gradual semantic shift of this adjective and noun reflects a trend to alter language in order to blur distinctions and substitute implicit meanings for explicit ones. In NewSpeak an individual is a subject of investigation, while a man or a woman is a person in her or his own right.