
Welcome to the 15-minute city

This is just the old "smart growth" idea resuscitated. Such policies still have a following but only because they sound good in theory.

Attempts to implement it have been shown to drive the price of housing up, significantly reducing discretionary incomes, which necessarily reduces the standard of living and increases poverty.

And relatively few job opportunities can be shoe-horned into the "smart" area.  It is generally inconsistent with economies of scale.


When a plague tore through Milan in the 1570s, everything had to change. Shops were closed. Mass was sung outdoors. A large church, the Lazzaretto, became a hospital. By 1578 the disease had fallen back, but the city was in financial trouble and had shed almost a fifth of its population.

This year, in the chaotic fallout from coronavirus, the Lazzaretto is once again part of an ambitious urban experiment. Giuseppe Sala, Milan’s leftwing mayor, announced in April that the area would host a pilot scheme for “rethinking the rhythms” of the Lombard capital. Amid the dense cityscape that has built up around the remains of the old hospital, the plan is to “offer services and quality of life within the space of 15 minutes on foot from home”.

The “15-minute” idea is based on research into how city dwellers’ use of time could be reorganised to improve both living conditions and the environment. Developed by Professor Carlos Moreno at the Sorbonne in Paris, the concept of “la ville du quart d’heure” is one in which daily urban necessities are within a 15-minute reach on foot or by bike. Work, home, shops, entertainment, education and healthcare — in Moreno’s vision, these should all be available within the same time a commuter might once have waited on a railway platform.

“One of the first lessons of Covid-19 is that we could radically change our ethos for working,” Moreno says. “In a few days, most people changed their remit and their jobs.” The mass, global switch to “working from home” (or living at work, as it may feel) suddenly makes multi-hour commutes appear wasteful, and clock-watching office life inefficient. Ironically enough, the French automobile group PSA (which makes Peugeot, Vauxhall and Citroën cars) was early to seize the opportunity to shift its non-production workforce to permanent remote mode.

Moreno, scientific director of entrepreneurship and innovation at the Sorbonne, is also special envoy to Paris mayor Anne Hidalgo, and has influenced her vigorous implementation of pedestrian and bike schemes. Re-elected as mayor last month, Hidalgo pushed her “Paris Respire” programme even further during lockdown, turning miles of traffic lanes into cyclist-friendly “corona pistes”.

Moreno models the 15-minute city on his research into the “new relationship between citizens and the rhythm of life in cities”. To achieve a better rhythm, he says, we need to develop multipurpose services — “one building, with many applications through the day. How, for example, we could use a school for other activities, during the weekend. We also want buildings that mix places for living and working at the same time — this reduces the time for commuting.”

Above all, the 15-minute city is one that cuts down unnecessary journeys: “We need to reduce the presence of cars on the streets,” says Moreno. Hidalgo has already banned traffic along parts of the Seine and on some Sundays along the Champs-Élysées.

Other cities, such as Buenos Aires, have introduced free bike-rental schemes for both residents and tourists, while pioneering Amsterdam has a new model, the City Doughnut, which aims to reduce emissions and waste in the drive towards carbon neutrality.

But though the “quarter-hour” framework seems convenient and ecologically sound, it implies many limitations. Lockdown challenged an understanding of cities as places that provide the chance introductions and chains of encounters upon which interesting careers (and personal lives) are constructed. Is it realistic to think of this 15-minute lasso as a permanent, practicable feature? “We don’t want to oblige people to stay in the 15-minute district,” Moreno says. “We don’t want to recreate a village. We want to create a better urban organisation.”

SOURCE 


Women board members increase businesses' profits tenfold, report finds

"Pull the other one" was my immediate old-fashioned reaction to seeing this claim.  It appears to emanate from this document.

The document concerned, however, is primitive from an academic point of view.  It has lots of pretty graphics but no details of its methodology: no definitions of the terms it uses, no breakdown into categories of economic activity, no breakdown by the recency or otherwise of the company and no breakdown by whether it served a poor, rich or middle-income clientele.  It appears in fact to have no controls at all.  I would very much like to see the raw data.  I am sure that the influence of feminist management would be much reduced if all other plausibly relevant factors were taken into account.

Just off the top of my head, let me suggest that firms selling cosmetics are exceptionally profitable.  The prices charged for some such products would certainly suggest large profit margins.  Women as a whole are very gullible about products that allegedly increase their beauty.  And beauty-promoting firms would undoubtedly have a strong female presence in their management.  So such female-led companies were not highly profitable because they were led by women.  They were exceptionally profitable because they were operating in an exceptionally profitable business sector.

The method of analysis is important too.  Are we looking, for example, at extreme quintiles?  This is a lamentably common practice elsewhere and normally means that there is no overall relationship in the data as a whole.
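To see why extreme-quintile comparisons can mislead, here is a minimal simulation sketch (the variables and numbers are my own illustration, not taken from the report): even when the relationship across the data as a whole is negligible, the top and bottom quintiles can still show a noticeable gap, and reporting only that gap overstates the effect.

```python
import numpy as np

# Illustrative simulation: a very weak relationship between a predictor
# (say, board composition) and an outcome (say, profit margin).
rng = np.random.default_rng(0)
n = 100_000
x = rng.standard_normal(n)               # predictor
y = 0.05 * x + rng.standard_normal(n)    # outcome: almost all noise

# Overall relationship: essentially nil.
r = np.corrcoef(x, y)[0, 1]

# Extreme-quintile comparison: sort by x, compare top vs bottom 20%.
order = np.argsort(x)
bottom = y[order[: n // 5]].mean()
top = y[order[-(n // 5):]].mean()

print(f"correlation in full data: {r:.3f}")
print(f"top-vs-bottom quintile gap: {top - bottom:.2f}")
```

Run as written, the full-sample correlation comes out near 0.05, yet the quintile gap looks like a substantial difference, which is exactly why a report quoting only extreme-group comparisons needs to be read with suspicion.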

So much more information is needed before these findings can be accepted.


Companies with greater numbers of female board members bring in 10 times greater profits, a study has revealed.

The research found that executive committees composed of more than a third of women have a net profit margin of 15.2 per cent, while those with none make just 1.5 per cent.

The ‘Women Count 2020’ report claims that this performance gap is costing the UK economy a potential £47 billion of pre-tax profit.

Lorna Fitzsimons, co-founder of The Pipeline, which commissioned the report, said the difference is driven by the fact that companies that are more representative have a "better understanding of clients and customer need”.

SOURCE



'Dead Wrong': Historian Calls Jihad a 'Myth'

Johan Norberg, a noted libertarian, calls Islamic aggression a myth. Norberg is a libertarian supporter of open borders who likes to sneer at “nativists” who oppose this insane policy. Like Chris Berg, another libertarian supporter of open borders, he thinks there is nothing special about nations.

I agree with libertarian thinking in general but some libertarians make it into a cult.  They become very rigid. They see liberty as the only needed explanation of human behaviour. That ignores important influences on behaviour -- such as genetics -- without which we can neither understand nor influence what is going on.

The electoral success of "Make America great again" should tell him that there really is something important about national identity -- and emotions generally.


An especially stark example of how Leftists thrive on distorting history -- a tactic pivotal to their very being -- recently appeared.  In a video titled “Dead Wrong: The Anti-Muslim Myth,” Johan Norberg, a senior fellow at the Cato Institute who holds an MA in “the History of Ideas” from the University of Stockholm, begins as follows:

The Nativist right likes to tell the story of the West through the prism of a conflict between Christendom and Islam.  One of the founding myths is the Battle of Vienna in 1683, when the united Christian armies defeated the Muslim Ottoman Turks.  This historical narrative is dead wrong, because back then, people concerned themselves with other divisions.

The rest of the brief video -- one minute, forty-two seconds are devoted to proving the “anti-Muslim myth” -- tries to substantiate this, primarily by arguing that there were divisions within Christendom, specifically infighting between Catholics and Protestants, which prompted some of the latter to ally with the Ottomans against Vienna.

This argument fails on many levels.  For starters, Norberg overlooks two simple and interrelated facts: 1) realpolitik -- prioritizing the practical over the ideal -- is as old as human society; 2) that does not mean that ideals do not exist and motivate politics, including war.  It’s not a question of “either/or.”

Naturally, as northern Protestants and southern Muslims had the same common enemy between them -- Catholic Christendom, particularly in the guise of the Holy Roman Empire -- the timeless adage that “the enemy of my enemy is my friend” was evident during the siege of Vienna, as well as previous conflicts.  Elizabeth I of England (r. 1558–1603), for example, formed an alliance with the Muslim Barbary pirates -- who during her reign had enslaved hundreds of thousands of Europeans -- against Catholic Spain.

Even so, Norberg ignores the fact that it is precisely because of the Catholic/Protestant schism -- which was entirely religious -- that Catholics and Protestants came to fight each other in the first place.  While he lumps them together as “Christians” in an effort to show that Christian unity against Islam never existed, Catholics and Protestants did not see each other as “fellow Christians” but religious enemies of the first order -- worse than Muslims.  It is because of this ideological divide that one could ally with Islam against the other without breaking faith.

In short, during the siege of Vienna, realpolitik was evident only in the very limited sense that the Catholic king of France, Louis XIV -- who once said “If there were no Algiers [to terrorize his competitors, particularly Spain] I would make one” -- sided against Catholic Vienna.

Other than that, most if not all of the Christians and Muslims involved at Vienna saw the conflict in distinctly religious terms, beginning with the battle-hardened Catholic king of Poland, John III Sobieski. Although he had little to gain by fighting on behalf of and eventually delivering Vienna, he still lamented how Islamic “fury is raging everywhere, attacking alas, the Christian princes with fire and sword.”  He also believed that “it is not a city alone that we have to save, but the whole of Christianity, of which the city of Vienna is the bulwark. The war is a holy one.”  Before setting off, he sent a message to Imre Thokoly, the Hungarian Protestant who was stirring trouble around Poland’s border, “that if he burnt one straw in the territories of his allies, or in his own, he would go and burn him and all his family in his house.”

Similarly, although the Ottoman pretext for war was support for their ally, the aforementioned Thokoly, the grand vizier who eventually led nearly 300,000 Turks to conquer Vienna, Kara Mustafa -- reputed to be “fanatically anti-Christian” -- exposed his mind earlier: “They ought,” he had told Ottoman high command, “to take advantage of the disorders of the Christians [Catholic-Protestant schism] by the siege of the place [Vienna], the conquest of which would assure that of all Hungary [currently the Turks’ “ally”], and open them a passage to the greatest victories.”  Later, during an elaborate pre-jihad ceremony, Sultan Muhammad IV, “desiring him [Mustafa] to fight generously for the Mahometan faith,” placed “the standard of the Prophet… into his hands for the extirpation of infidels, and the increase of Muslemen.”

There are many other examples highlighting the religious/ideological nature of the Ottoman siege of Vienna: before initiating its bombardment, Kara Mustafa offered the city the standard Islamic ultimatum (convert, capitulate, or else); and the Ottomans are constantly depicted as crying out typical jihadi phrases, such as “Allahu Akbar.”

So much for Norberg’s categorical claim that “back then, people concerned themselves with other divisions [than religion].”

In the end, however, Norberg’s greatest failure is that his is a classic strawman argument.  Recall the title of his video: “Dead Wrong: The Anti-Muslim Myth.”  Recall his opening sentence: “The Nativist right likes to tell the story of the West through the prism of a conflict between Christendom and Islam.”  Yet, while pretending to debunk the religious nature of the perennial conflict between Christendom and Islam -- which dramatically manifested itself in countless ways and battles over the course of a millennium before the siege of Vienna in 1683 -- he talks only about that one encounter (and fails even there).

The reason is evident: before the aforementioned Catholic-Protestant rift began in the sixteenth century, Christian unity against Islam was relatively solid, providing little material for people like Norberg -- such as John Voll and William Polk, professors of Islamic history -- to manipulate in an effort to show that  the “anti-Muslim myth” is “dead wrong.”

Such are the Left’s tired tricks when conforming history to its narrative: take exceptions and aberrations, exaggerate and place them at center stage, and completely ignore the constants.  Above all, offer no context.

SOURCE



Alcohol is good for your brain

This is such a fun finding I could not resist putting it up. It's from a major medical journal so carries some weight. The effects were very slight but in a good direction from my point of view.

Sad to say, no controls for demographics were mentioned, so the finding could be wholly artifactual. The drinkers could for instance have been slightly richer on average -- and we know that rich people tend to have better health and higher IQs.
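The kind of artifact being suggested is easy to sketch. In the toy simulation below (entirely my own construction, not the study's data), income drives both the chance of drinking and the cognition score, while drinking itself has zero causal effect -- yet drinkers still score higher on average until income is controlled for.

```python
import numpy as np

# Toy confounding sketch: income raises both the probability of being a
# moderate drinker and the cognition score; drinking itself does nothing.
rng = np.random.default_rng(1)
n = 50_000
income = rng.standard_normal(n)
drinker = (income + rng.standard_normal(n)) > 0        # richer -> more likely to drink
cognition = 0.5 * income + rng.standard_normal(n)      # richer -> higher score

# Raw comparison: drinkers look smarter, despite zero causal effect.
raw_gap = cognition[drinker].mean() - cognition[~drinker].mean()

# Crudely "control" for income by regressing it out, then re-compare.
beta = np.polyfit(income, cognition, 1)
resid = cognition - np.polyval(beta, income)
adj_gap = resid[drinker].mean() - resid[~drinker].mean()

print(f"raw drinker advantage:      {raw_gap:.2f}")
print(f"income-adjusted advantage:  {adj_gap:.2f}")
```

The raw gap is sizeable while the income-adjusted gap is essentially zero -- which is exactly what "wholly artifactual" would look like if the published association were driven by an uncontrolled demographic variable.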


Association of Low to Moderate Alcohol Drinking With Cognitive Functions From Middle to Older Age Among US Adults

Ruiyuan Zhang et al.

Abstract

Objective:  To investigate whether associations exist between low to moderate alcohol drinking and cognitive function trajectories or rates of change in cognitive function from middle age to older age among US adults.

Design, Setting, and Participants:  A prospective cohort study of participants drawn from the Health and Retirement Study (HRS), a nationally representative sample of US adults, with mean (SD) follow-up of 9.1 (3.1) years. In total, 19 887 participants who had their cognitive functions measured in the HRS starting in 1996 through 2008 and who had participated in at least 3 biennial surveys were included. The data analysis was conducted from June to November 2019.

Exposures:  Alcohol consumption and aging.

Main Outcomes and Measures:  Trajectories and annual rates of change for the cognitive domains of mental status, word recall, and vocabulary and for the total cognitive score, which was the sum of the mental status and word recall scores. Participants were clustered into 2 cognitive function trajectories for each cognition measure assessed based on their scores at baseline and through at least 3 biennial surveys: a consistently low trajectory (representing low cognitive scores throughout the study period) and a consistently high trajectory (representing high cognitive scores throughout the study period).

Results: The mean (SD) age of 19 887 participants was 61.8 (10.2) years, and the majority of the HRS participants were women (11 943 [60.1%]) and of white race/ethnicity (16 950 [85.2%]). Low to moderate drinking (<8 drinks per week for women and <15 drinks per week for men) was significantly associated with a consistently high cognitive function trajectory and a lower rate of cognitive decline. Compared with never drinkers, low to moderate drinkers were less likely to have a consistently low trajectory for total cognitive function (odds ratio [OR], 0.66; 95% CI, 0.59-0.74), mental status (OR, 0.71; 95% CI, 0.63-0.81), word recall (OR, 0.74; 95% CI, 0.69-0.80), and vocabulary (OR, 0.64; 95% CI, 0.56-0.74) (all P < .001). In addition, low to moderate drinking was associated with decreased annual rates of total cognitive function decline (β coefficient, 0.04; 95% CI, 0.02-0.07; P = .002), mental status (β coefficient, 0.02; 95% CI, 0.01-0.03; P = .002), word recall (β coefficient, 0.02; 95% CI, 0.01-0.04; P = .01), and vocabulary (β coefficient, 0.01; 95% CI, 0.00-0.03; P = .08). A significant racial/ethnic difference was observed for trajectories of mental status (P = .02 for interaction), in which low to moderate drinking was associated with lower odds of having a consistently low trajectory for white participants (OR, 0.65; 95% CI, 0.56-0.75) but not for black participants (OR, 1.02; 95% CI, 0.74-1.39). Finally, the dosage of alcohol consumed had a U-shaped association with all cognitive function domains for all participants, with an optimal dose of 10 to 14 drinks per week.

Conclusions and relevance:  These findings suggested that low to moderate alcohol drinking was associated with better global cognition scores, and these associations appeared stronger for white participants than for black participants. Studies examining the mechanisms underlying the association between alcohol drinking and cognition in middle-aged or older adults are needed.

SOURCE



Why does the coronavirus sometimes strike young people?

It's usually clear-cut.  The virus only strikes people with impaired immune systems -- people who have other ailments.  Old people normally have other ailments, so they are very often affected by the virus.

And when young people get it, they are usually ones who are ill already. They too have other ailments.  But how come there are a few cases of young people being infected who seem otherwise healthy? Why does the virus single them out?  Why in their case was being young and healthy not enough to protect them?

The article from a major medical journal below shows why, in at least some cases.  It shows that they have a genetic defect that weakens their immune system in crucial ways. That may not be the answer in all cases but it is clearly now an in-principle explanation.  The vast majority of young people are safe.


Presence of Genetic Variants Among Young Men With Severe COVID-19

Caspar I.van der Made et al.

Abstract

Objective:  To explore the presence of genetic variants associated with primary immunodeficiencies among young patients with COVID-19.

Design, Setting, and Participants:  Case series of pairs of brothers without medical history meeting the selection criteria of young (age <35 years) brother pairs admitted to the intensive care unit (ICU) due to severe COVID-19. Four men from 2 unrelated families were admitted to the ICUs of 4 hospitals in the Netherlands between March 23 and April 12, 2020. The final date of follow-up was May 16, 2020. Available family members were included for genetic variant segregation analysis and as controls for functional experiments.

Main Outcome and Measures:  Results of rapid clinical whole-exome sequencing, performed to identify a potential monogenic cause. Subsequently, basic genetic and immunological tests were performed in primary immune cells isolated from the patients and family members to characterize any immune defects.

Results:  The 4 male patients had a mean age of 26 years (range, 21-32), with no history of major chronic disease. They were previously well before developing respiratory insufficiency due to severe COVID-19, requiring mechanical ventilation in the ICU. The mean duration of ventilatory support was 10 days (range, 9-11); the mean duration of ICU stay was 13 days (range, 10-16). One patient died. Rapid clinical whole-exome sequencing of the patients and segregation in available family members identified loss-of-function variants of the X-chromosomal TLR7. In members of family 1, a maternally inherited 4-nucleotide deletion was identified (c.2129_2132del; p.[Gln710Argfs*18]); the affected members of family 2 carried a missense variant (c.2383G>T; p.[Val795Phe]). In primary peripheral blood mononuclear cells from the patients, downstream type I interferon (IFN) signaling was transcriptionally downregulated, as measured by significantly decreased mRNA expression of IRF7, IFNB1, and ISG15 on stimulation with the TLR7 agonist imiquimod as compared with family members and controls. The production of IFN-γ, a type II IFN, was decreased in patients in response to stimulation with imiquimod.

Conclusions and Relevance:  In this case series of 4 young male patients with severe COVID-19, rare putative loss-of-function variants of X-chromosomal TLR7 were identified that were associated with impaired type I and II IFN responses. These preliminary findings provide insights into the pathogenesis of COVID-19.

SOURCE


Historians blast BBC for 'unbalanced' News At Ten report claiming Churchill was responsible for 'mass killing' of up to three million in 1943 Bengal famine

These attacks on Churchill are absurd.  Churchill was fighting two wars, with Germany and Japan, so simply had no resources left to give to India.  Britain was no food bowl at that time. It imported much of its food.  And transporting anything by ship was a huge challenge with German U-boats sinking many of the transports.

And it was not his responsibility anyway.  It was the responsibility of the government of Bengal.  That government might conceivably have imported grain from Australia but -- again -- where would they get the ships to carry it?  And finding the grain in India would be a very unlikely enterprise. India was always on the brink of starvation and with many men away at the war, that would have meant no grain to spare.

I was in my youth an admirer of Churchill but I have much revised that view now that I have heard of the repatriation of the Cossacks (Southern Russians).  The Cossacks were very anti-Soviet and many joined the Wehrmacht to fight the Red army.  The Wehrmacht lost the war, however, so towards the end many of the Cossacks in the Wehrmacht escaped to British lines in Austria.  They knew that Stalin would murder them and thought that they would be safe as British prisoners of war.

But Churchill betrayed them.  He sent them back to Stalin and almost certain death.  Why did he do it?  Because Russia had considerable numbers of British and French prisoners of war and Churchill wanted them released.  It was a prisoner swap.  But it was not the usual swap.  Swapped prisoners are normally welcomed back to their homeland.  The Cossacks were killed instead.  And, knowing that would happen, Churchill should have done some other deal -- presumably repatriating all non-Cossack prisoners.

So Churchill was no saint. He was a politician.  The Cossacks were a huge blot on his record.  But nobody is perfect and he is certainly well worthy of the honour that is normally given to him.  His unrelenting opposition to Communism is a large part of that.

In 2002 the BBC ran a massive national poll asking citizens of the United Kingdom to vote for the 100 greatest Britons of all time. At the top of the list was Winston Churchill.


Historians have criticised the BBC for an 'unbalanced' News At Ten report claiming Churchill was responsible for the 'mass killing' of up to three million people in the 1943 Bengal Famine.

A section broadcast on Tuesday examined how modern Indians view the wartime prime minister as part of a series on Britain's colonial legacy, and featured a series of damning statements about his actions.

Rudrangshu Mukherjee of Ashoka University in India said Churchill was seen as a 'precipitator' of 'mass killing' due to his policies, while Oxford's Yasmin Khan claimed he could be guilty of 'prioritising white lives over Asian lives' by not sending relief.

But today historians said the report ignored the complexities behind the famine in favour of squarely blaming Churchill. World War Two expert James Holland argued he had tried to help but faced a lack of resources due to the war against Japan.

It comes amid a wider campaign to trash the war hero's legacy, with his statue defaced with the word 'racist' by Black Lives Matter protesters in London and civil servants calling for the Treasury's 'Churchill Room' to be renamed.

The Bengal Famine was triggered by a cyclone and flooding in Bengal in 1942, which destroyed crops and infrastructure.

Historians agree that many of the three million deaths could have been averted with a more effective relief effort, but are divided over the extent to which Churchill was personally to blame.

Yogita Limaye, the BBC News India correspondent who led the report, said many Indians blamed him for 'making the situation worse'.

But historians suggested the report attributed too much of the blame onto Churchill when other factors were more significant.

Tirthankar Roy, a professor in economic history at the LSE, argues India's vulnerability to weather-induced famine was due to its unequal distribution of food.

He also blamed a lack of investment in agriculture and failings by the local government.

'Winston Churchill was not a relevant factor behind the 1943 Bengal famine,' he told The Times. 'The agency with the most responsibility for causing the famine and not doing enough was the government of Bengal.'

Churchill has been blamed for down-playing the crisis and arguing against re-supplying Bengal to preserve ships and food supplies for the war effort.

However, his defenders insist that he did try to help and delays were a result of conditions during the war.

They point out that after receiving news of the spreading food shortages he told his Cabinet he would welcome a statement from Lord Wavell, the new Viceroy of India, about how he planned to ensure the problems were 'dealt with'. He then wrote a personal letter urging the Viceroy to take action.

The historian James Holland weighed into the row today, insisting that Churchill faced immense difficulties supplying Bengal due to the amount of British resources tied up in the fight against the Japanese in the Pacific.

'In light of the latest furore over the Bengal Famine and people wrongly still insisting it was Churchill's fault, here's this on the subject,' he tweeted.

'His accusers don't a) understand how the war worked, or b) that his hands were tied over use of Allied shipping.'

Sir Max Hastings, the military historian, accepted that Churchill's behaviour was a 'blot on his record' but argued it should be considered against his achievements in helping to defeat fascism.

The recent Black Lives Matter protests have seen a renewed focus on Churchill's legacy, including calls for his statue to be taken down from Parliament Square.

At one point the monument was even boxed in by London Mayor Sadiq Khan to protect it from vandalism during a weekend of demonstrations. Figures of Gandhi and Mandela were also encased with wooden sheeting, at a cost of £30,000.

Threats to the statue triggered a strong reaction from defenders of the national hero who pointed out that his greatest achievement was defeating racist, anti-Semitic fascism.

At the time, Boris Johnson criticised the calls as being the 'height of lunacy'. The Prime Minister said he would resist any attempt to remove the statue 'with every breath in my body'.

Churchill's legacy has been attacked in other quarters, with a group of civil servants recently complaining that they did not feel 'comfortable' with having a room in the Treasury named after him.

BBC News insiders told MailOnline its report on the Bengal Famine made clear Churchill didn't cause the disaster but has been accused by some of making it worse.

A BBC spokesman said: 'The item was the latest in a series looking at Britain's colonial legacy worldwide.

'The series includes different perspectives from around the world, in this case from India, including a survivor from the Bengal famine, as well as Oxford historian Dr Yasmin Khan.

'The report also clearly explained Churchill's actions in India in the context of his Second World War strategy. We believe these are all important perspectives to explore and we stand by our journalism.'

SOURCE


Melbourne intensive care nurse's blunt warning of big coronavirus risk to younger adults

This is contrary to all previous observations so requires explanation.

The explanation probably lies in the origin of the current outbreak. It originated in big blocks of welfare housing.

Many of the residents would be there because they had health challenges.  So they fit the usual observation that substantial co-morbidities are normally required for the virus to take hold.

So my hypothesis would be that the young patients came from welfare housing.  The virus normally hits the elderly most because most elderly do have substantial co-morbidities.


A senior Melbourne intensive care nurse says hospitals are preparing for the prospect of deaths among younger Victorians as authorities battle to rein in the state's coronavirus cases.

The head intensive care unit nurse at the Royal Melbourne Hospital, Michelle Spence, said there was a growing number of younger adults being hospitalised by the virus.

"What we are seeing now is young people who are going to die. There is no doubt about it," she said. "And these are people who are 30s, 40s, 50s, who have no past history."

She said deaths in Victoria had so far predominately been in older people, but that would change.

Yesterday, authorities revealed 20 per cent of people in Victorian hospitals with the virus were aged under 50, including four children.

The figures also showed a quarter of COVID-19 infections were being recorded in people aged in their 20s.

The Royal Melbourne Hospital has acquired a further 22 ventilators as the intensive care unit prepares for a surge in cases.

Ms Spence, who is the hospital's ICU nurse manager, said the hospital had patients ranging from their 30s to their 80s "and all of them are at varying degrees of their COVID journey".

"We're definitely not just seeing the elderly, that is not the case at all."  "It is definitely not an old person's disease," Ms Spence said.

She said a COVID patient's time in the intensive care unit was a long, slow process, where very ill people were separated from their families.

"Being in ICU is not a nice place to be," she said. "It is absolutely not a comfortable thing to do."

Ms Spence warned the process of recovery, even after patients leave ICU, could take a long time.

She urged Melburnians of all ages to follow the directive to wear a face mask when outside their homes, saying wearing a mask was "way more comfortable than being on a ventilator".

SOURCE 




How Earth’s Climate Changes Naturally (and Why Things Are Different Now)

The heading above is from a long but empty-headed article that catalogs, in a hand-waving way, the various influences on earth's climate.

One might expect that a consideration of all the natural influences would inspire doubt about the anthropogenic global warming thesis. One would think that a signal emanating from human deeds would be hard to distinguish from all the other influences at work.

No such luck. The article is straight warmism.  The idea seems to be to create an air of profundity in its claims.  By discussing all the other climate influences and still showing anthropogenic global warming at work, the article reassures us that a full scholarly exercise has been undertaken before concluding that anthropogenic global warming exists.  All "t"s have been crossed and all "i"s have been dotted.

But the article in fact gives no evidence at all for anthropogenic global warming.  The most it offers is a link to another paper which in turn relies on the IPCC reports. So it is all just the same old same old.  It's a long article but there's no reason to spend any time on it.

Earth’s climate has fluctuated through deep time, pushed by these 10 different causes. Here’s how each compares with modern climate change. Orbital wobbles, plate tectonics, evolutionary changes and other factors have sent the planet in and out of ice ages.

Earth has been a snowball and a hothouse at different times in its past. So if the climate changed before humans, how can we be sure we’re responsible for the dramatic warming that’s happening today?

In part it’s because we can clearly show the causal link between carbon dioxide emissions from human activity and the 1.28 degree Celsius (and rising) global temperature increase since preindustrial times. Carbon dioxide molecules absorb infrared radiation, so with more of them in the atmosphere, they trap more of the heat radiating off the planet’s surface below.
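The warming mechanism asserted here is commonly quantified with the simplified logarithmic forcing expression of Myhre et al. (1998) — a formula the article itself does not cite, so this is background rather than the article's own method; the concentration figures below are illustrative round numbers:

```python
import math

# Simplified logarithmic forcing expression for CO2 (Myhre et al. 1998):
# delta_F = 5.35 * ln(C / C0) watts per square metre, where C0 is the
# approximate pre-industrial concentration (~280 ppm).
def co2_forcing(c_ppm: float, c0_ppm: float = 280.0) -> float:
    return 5.35 * math.log(c_ppm / c0_ppm)

# Forcing from the rise to a roughly present-day 415 ppm: about 2.1 W/m^2
print(round(co2_forcing(415.0), 2))

# A doubling of CO2 gives the often-quoted ~3.7 W/m^2:
print(round(co2_forcing(560.0), 2))
```

The logarithmic form is why each successive increment of CO2 adds somewhat less forcing than the last.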

But paleoclimatologists have also made great strides in understanding the processes that drove climate change in Earth’s past. Here’s a primer on 10 ways climate varies naturally, and how each compares with what’s happening now.

MORE here

0 comments


Indigenous owners lose bid to protect land earmarked for Shenhua mine

You can guarantee that any new mine, dam or road will be found to trespass on an Aboriginal sacred site. There's money in such claims. They usually result in a "compensation" payout for the Aborigines and their lawyers.

But the company fought this one, so everyone is out of pocket.

This does clearly need reform. A rule specifying that there will be no monetary reward for such claims would probably result in most such claims never even being raised. An apology would have to suffice.


An Aboriginal group has lost its bid to protect a culturally valuable site from being destroyed for the Shenhua coal mine in northern NSW, but says the fight to protect the area is not over.

Federal Court Judge Wendy Abraham dismissed the application for a judicial review of the Environment Minister's decision not to protect the Mount Watermark site near Gunnedah from the controversial open-cut mine.

The applicant, Veronica 'Dolly' Talbott, acting as a member of the Gomeroi Traditional Custodians, had submitted that Environment Minister Sussan Ley took into account an "irrelevant consideration" when she weighed the impact of the mine on Indigenous sites against perceived social and economic benefits to the local community.

In dismissing the application, Justice Abraham said the applicant had failed to establish that the social and economic impacts are irrelevant under the Heritage Act.

The judgment said Minister Ley had stated she "considered the expected social and economic benefits of the Shenhua Watermark Coal Mine to the local community outweighed the impacts of the mine ... as a result of the likely destruction of parts of their Indigenous cultural heritage."

Ms Talbott said the decision "demonstrates the abject failure of the [Aboriginal and Torres Strait Islander Heritage Protection] Act to provide meaningful protection for areas of particular significance to Aboriginal people."

She said the decision had not deterred Gomeroi elders from continuing to seek protection for the area, and the group, which represents more than 600 Gomeroi people and 31 Aboriginal nations, had made a new application to the Environment Minister to protect the area's sacred sites.

"If this mega-mine proceeds, our interlinked sacred places will be completely destroyed and obliterated from the landscape."

Ms Talbott said there is "an urgent need" to protect places of significance to Aboriginal people, especially following the destruction of the Juukan caves by mining giant Rio Tinto earlier this year.

A spokesman for Minister Ley said the ruling confirms her decision was made in accordance with the provisions of the act and has already announced her intention "to commence a national engagement process for modernising the protection of Indigenous cultural heritage, commencing with a round table meeting of state Indigenous and environment ministers."

The meeting will be jointly chaired by Minister Ley and the Minister for Indigenous Australians Ken Wyatt.

SOURCE   

1 comments


Dr. Ridd: James Cook University wins unlawful sacking decision

The grounds for the university's actions were contemptible. He was sacked for disagreeing with his colleagues. If academics cannot disagree with one another, where does that leave the search for truth?

He was not even abusive in what he said. He just said that their conclusions needed more validation -- a scientific comment if ever there was one.

This needs to go to appeal but funding may be a barrier to that.

The reason for the furore is that the JCU scientists said that the reef was damaged by global warming. Dr. Ridd demurred.


The Federal Court has allowed an appeal of a decision which found James Cook University acted unlawfully in its 2018 sacking of Peter Ridd, after the professor questioned colleagues' research on the impact of global warming on the Great Barrier Reef.

Dr Ridd was awarded $1.2 million in damages by the Federal Circuit Court in September, which had earlier found JCU sacked the physics professor unlawfully.

The case attracted intense focus due to Dr Ridd's scepticism of climate change science and the broader debate about free speech at Australian universities.

The university reiterated last year it would launch the appeal, and has maintained its sacking of the professor was based on his treatment of colleagues rather than the expression of his scientific views.

Dr Ridd had originally sought reinstatement to his position but later abandoned this in favour of compensation.

In a judgment published on Wednesday, the Federal Court set aside that compensation decision and upheld the university's appeal against the earlier ruling that it had acted unlawfully.

Justices John Griffiths and Roger Derrington found Dr Ridd's enterprise agreement did not give him an "untrammelled right" to express his professional opinions beyond the standards imposed by the university's code of conduct.

The termination of his employment therefore did not breach the Fair Work Act, they said.

Outlining his final declarations and penalties last year in September, Federal Circuit Court Judge Salvatore Vasta suggested the university's conduct had bordered on "paranoia and hysteria fuelled by systemic vindictiveness".

"In this case, Professor Ridd has endured over three years of unfair treatment by JCU – an academic institution that failed to respect the rights to intellectual freedom that Professor Ridd had as per [his enterprise agreement]," the judge decided.

Conservative think-tank the Institute of Public Affairs described the new Federal Court judgment on Wednesday as a "devastating blow" to freedom of speech.

"Alarmingly, this decision shows that contractual provisions guaranteeing intellectual freedom do not protect academics against censorship by university administrators," IPA director of policy Gideon Rozner said. "The time has come for the Morrison government to intervene."

He added that Dr Ridd was now considering his legal options around a High Court challenge.

SOURCE 


1 comments


Is peer review bad for science?

I have had considerable experience of peer review, both as an author (of c.300 articles) and as a reviewer, and my overwhelming impression of the process is that most reviewers do not read what they review. They just look at the conclusions of the article and, if they sound right, the reviewer passes the article with just a few desultory comments.

There are some reviewers who put up detailed and apposite comments but their comments often betray an ignorance of the previous research on the subject. They may know some recent reports but not the deep background to the field.  For that reason I always tried to supply the deep background in my articles and that did seem to pay off in that about half of my articles got accepted on the first submission.

Many of the articles that were rejected were ones where reviewers seemed not to be interested in either the previous research or my findings, apparently because my conclusions were uncongenial to them.

Articles that went strongly against the consensus certainly got much more negative treatment than ones that did not rock the boat.


After studying the popular practice of peer review of scientific journal articles for several years, I have reluctantly concluded that peer review is bad for science. While the practice has its good side, there are several ways that it greatly impedes progress, and the bad greatly outweighs the good.

To begin with, let’s look at what peer review tries to do. The obvious thing is to block the publication of fake science. However, this appears to be a rare event in most sciences. There are several million journal articles published each year, all peer reviewed, typically by two or three reviewers. Clearly these many millions of reviews did not keep any of these myriad articles from being published.

Paradoxically, however, most of these articles were in fact rejected based on peer review; many were rejected many times. Top journals often boast of having high rejection rates, like 80% or so. If this is the general practice then the average article must be submitted to something like five journals before it is accepted and published. If each submission is peer reviewed then that is a lot of reviews per article, perhaps ten to fifteen on average.

Given that all of these multiply rejected articles eventually get published, something other than simple gate keeping must be going on. This something looks to be an extremely laborious sorting process, whereby each article eventually finds the “right” journal. It is hard to see any value being added by these many millions of peer reviews. Given modern search technologies, which journal an article ultimately appears in no longer seems very important.

One negative aspect of peer review is well known. This is where gate keeping keeps great new ideas from being published. Max Planck, who discovered the quantum nature of energy, put it very nicely, saying something like “Your ideas will (only) be accepted when your students become journal editors.” This is the dark side of peer review blocking science, the novel good ideas get blocked as bad ideas.

But there are several other bad things that flow from peer review that I have not seen mentioned. These down sides are features of the incredibly time consuming and laborious nature of the practice.

First there is the huge time delay between the time a paper is written and when it is finally published. Let’s say that peer review takes four months, which is probably pretty fast. If the average paper is reviewed five times then that is almost two years of reviews before it is finally accepted. (Also, there are many other steps between these reviews, so the average might be more like four years from first submission to final publication.)

If two million papers are published each year, with an average delay of say two years each, due to peer review, that is an accumulation of four million years of delay every year. It is reasonable to believe that eliminating this vast tide of delay would dramatically speed up the progress of science.

Then there is the cost. Organizing and managing the peer review process is probably the greatest expense that journal publishers face. Keep in mind that given an 80% rejection rate, something like five articles will be reviewed for every one published. At three reviews each that means fifteen reviews per published article.

The high cost of journals and articles is a major obstacle to access by all but the richest universities and researchers. This probably greatly impedes the progress of science.

Then there is the huge amount of time that researchers spend reviewing each other’s articles. Reviews are expected to be comprehensive, so they probably take from 10 to 20 hours each, maybe more. If there are fifteen reviews per article published that is 150 to 300 hours of review time.

Multiply that by 2 million articles published and we get an incredible 300 to 600 million hours a year devoted to reviewing, rather than to research. Assuming that a work year is 2000 hours, this is like taking 150 to 300 thousand researchers off the job, just to peer review each other’s papers. Think of what that amount of research might create. Again, this is a huge loss to the progress of science.
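The arithmetic of the last few paragraphs can be reproduced as a back-of-envelope calculation; every input below is the author's own assumption, not measured data:

```python
# Back-of-envelope reproduction of the peer-review cost estimates above.
papers_per_year = 2_000_000
reviews_per_published_paper = 15   # ~5 submissions x ~3 reviews each
hours_per_review = (10, 20)        # low and high estimates per review
work_year_hours = 2_000

total_reviews = papers_per_year * reviews_per_published_paper

# Total reviewing time per year, in hours:
hours_low = total_reviews * hours_per_review[0]   # 300 million
hours_high = total_reviews * hours_per_review[1]  # 600 million

# Equivalent number of full-time researchers diverted to reviewing:
fte_low = hours_low // work_year_hours    # 150,000
fte_high = hours_high // work_year_hours  # 300,000

print(f"{hours_low:,} to {hours_high:,} hours = "
      f"{fte_low:,} to {fte_high:,} researcher-years per year")
```

The output matches the article's 300-to-600-million-hour and 150-to-300-thousand-researcher figures, showing the conclusion follows directly from the stated assumptions.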

Conclusion: Peer review adds an enormous amount of delay, cost and distraction to the process of science. It does not do enough good to justify these huge adverse impacts on the rate of scientific progress. Thus on balance peer review is bad for science.

SOURCE

0 comments


"Jaw-dropping" global crash in children being born — just what the ecofascists want

Stupid history-defying predictions below which assume that the influences on human reproduction will remain the same. They will not.

Post-pill, all the non-maternal women are currently being removed from the gene pool by reason of the simple fact that they now rarely have children. Women like them will become rarer and rarer. And their decisions are a big influence behind the current "birth-dearth".

So all the births of the not too distant future will come from maternally-inclined women. And how many children will those women have? As many as they can afford (and then some in some cases).

So the birthrates in advanced nations will recover and the  population will start growing again -- albeit off a lower base.

And here's a way-out one:  It seems to have become fashionable for celebrity women to have children, multiple children in most cases.  You would not think that women who live by their looks would risk  their figures by having children, but they are in fact doing it -- the Kardashians, for instance.

Children now seem to have become a sign of affluence.  They are the ultimate luxury -- even better than big yachts and Gulfstream jets.  And lots of people DO emulate celebrities.  Many women in the near future may start having children because it is fashionable or simply because they want to show off.  One can imagine the conversations:  "I've got three.  How many have you got?"

So who knows what the future holds?


The world is ill-prepared for the global crash in children being born which is set to have a "jaw-dropping" impact on societies, say researchers.

Falling fertility rates mean nearly every country could have shrinking populations by the end of the century.

And 23 nations - including Spain and Japan - are expected to see their populations halve by 2100.

Countries will also age dramatically, with as many people turning 80 as there are being born.

What is going on?

The fertility rate - the average number of children a woman gives birth to - is falling.

If the number falls below approximately 2.1, then the size of the population starts to fall.

In 1950, women were having an average of 4.7 children in their lifetime.

Researchers at the University of Washington's Institute for Health Metrics and Evaluation showed the global fertility rate nearly halved to 2.4 in 2017 - and their study, published in the Lancet, projects it will fall below 1.7 by 2100.

As a result, the researchers expect the number of people on the planet to peak at 9.7 billion around 2064, before falling to 8.8 billion by the end of the century.
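How quickly sub-replacement fertility compounds can be seen in a toy generational model. The 2.1 replacement rate and the 1.7 projection are from the article; treating a generation as a single multiplication step, and the ~30-year generation length, are my simplifying assumptions:

```python
# Toy generational model: at a constant fertility rate f, each
# generation is roughly f / 2.1 times the size of the previous one
# (2.1 being the approximate replacement rate). Illustrative only.
REPLACEMENT = 2.1

def project(population: float, fertility: float, generations: int) -> float:
    """Scale a population by (fertility / replacement) per generation."""
    for _ in range(generations):
        population *= fertility / REPLACEMENT
    return population

# At the projected rate of 1.7, each generation is ~81% the size of
# the last, so after three generations (~90 years) a population falls
# to roughly 53% of its starting size:
print(round(project(1.0, 1.7, 3), 2))  # 0.53
```

This is why seemingly modest fertility gaps produce the dramatic 2100 projections quoted below: the shortfall multiplies generation after generation.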

"That's a pretty big thing; most of the world is transitioning into natural population decline," researcher Prof Christopher Murray told the BBC.

"I think it's incredibly hard to think this through and recognise how big a thing this is; it's extraordinary, we'll have to reorganise societies."

Why are fertility rates falling?

It has nothing to do with sperm counts or the usual things that come to mind when discussing fertility.

Instead it is being driven by more women in education and work, as well as greater access to contraception, leading to women choosing to have fewer children.

In many ways, falling fertility rates are a success story.

Which countries will be most affected?

Japan's population is projected to fall from a peak of 128 million in 2017 to less than 53 million by the end of the century.

Italy is expected to see an equally dramatic population crash from 61 million to 28 million over the same timeframe.

They are two of 23 countries - which also include Spain, Portugal, Thailand and South Korea - expected to see their population more than halve.

"That is jaw-dropping," Prof Christopher Murray told me.

China, currently the most populous nation in the world, is expected to peak at 1.4 billion in four years' time before nearly halving to 732 million by 2100. India will take its place.

The UK is predicted to peak at 75 million in 2063, and fall to 71 million by 2100.

However, this will be a truly global issue, with 183 out of 195 countries having a fertility rate below the replacement level.

Why is this a problem?

You might think this is great for the environment. A smaller population would reduce carbon emissions as well as deforestation for farmland.

"That would be true except for the inverted age structure (more old people than young people) and all the uniformly negative consequences of an inverted age structure," says Prof Murray.

Prof Murray adds: "It will create enormous social change. It makes me worried because I have an eight-year-old daughter and I wonder what the world will be like."

Who pays tax in a massively aged world? Who pays for healthcare for the elderly? Who looks after the elderly? Will people still be able to retire from work?

"We need a soft landing," argues Prof Murray.

Are there any solutions?

Countries, including the UK, have used migration to boost their population and compensate for falling fertility rates.

However, this stops being the answer once nearly every country's population is shrinking.

"We will go from the period where it's a choice to open borders, or not, to frank competition for migrants, as there won't be enough," argues Prof Murray.

Some countries have tried policies such as enhanced maternity and paternity leave, free childcare, financial incentives and extra employment rights, but there is no clear answer.

Sweden has dragged its fertility rate up from 1.7 to 1.9, but other countries that have put significant effort into tackling the "baby bust" have struggled. Singapore still has a fertility rate of around 1.3.

Prof Murray says: "I find people laugh it off; they can't imagine it could be true, they think women will just decide to have more kids.

"If you can't [find a solution] then eventually the species disappears, but that's a few centuries away."

SOURCE

0 comments


British Fire service bans Black Country flag due to 'potential link to slavery'

Chains can be a sign of unity.  See the Eton boating song

The Black Country was central to the industrial revolution. At that time pollution filled the sky, and the region was described as "Black by Day" and "Red by Night" by Elihu Burritt. Hence the colours in the flag.

Their foundries also forged the anchors and chains for great ships like the Titanic, hence the chains on the flag.



A fire service has been criticised for banning a Black Country flag designed by a 12-year-old schoolgirl because it features a chain with a "potential link to slavery".

The red, black and white emblem was designed by Gracie Sheppard in 2013 to commemorate the industrial heritage of the West Midlands area.

Selected by public vote, it is proudly displayed on homes and buildings across the region on July 14, known as Black Country Day.

However, West Midlands Fire Service has now refused to display the flag at its stations, claiming the chains pictured on it may have historically been associated with the slave trade.

The service pledged its support to the Black Lives Matter movement and said it wants its staff to instead celebrate Black Country Day "in alternative ways" until they have established what the chains represent....

SOURCE

1 comments


Opinions divided on whether Australia could effectively ban extremist far-right organisations

It's good that such bans are only talk at this stage.  The big issue is in defining who is "extremist far-right".  In America, conservative family-oriented organizations are sometimes branded as "white supremacist" or the like simply because they are conservative. One man's moderate can be another man's extremist.

To me all American Leftists are racist extremists because of their support for "affirmative action".  So any bans should be founded on a very clear definition of "far right" and "extremist" that is widely agreed on both sides of the political spectrum.

To me the only justifiable bans, if any, are on people who actually practice violence.  Big talk is common but it is mostly just hot air.  And where do we find any Australian Rightists practicing violence, let alone ones who are members of a violent group?  The repeated acts of violence by Muslims surely make them a group of political extremists but that seems to be OK somehow.

The only Australian "Far Rightist" who actually attacked and killed people as far as I can remember was Brenton Tarrant and he was very much a lone wolf.  And he was as much a Greenie as a Rightist.  And he didn't even carry out his attacks in Australia, sadly for New Zealanders.

So there are undoubtedly some Australians with views that could be described as "far Right" but what harm have they done?  They don't seem capable of energizing even one-another into violence, let alone people in the population at large.

Neo-Nazis are undoubtedly extremists with some views that identify them as Rightist, so what harm have they done in Australia? I did a close-up study of them some years ago (See here and here) and found not even advocacy of violence among them. They would say "I wish.." for violence against someone but showed not the slightest disposition to do anything about it personally.

So if even Australian neo-Nazis are non-violent in practice where are the "extremist far-right" organizations that need to be banned?


Terror analysts say there is growing pressure on Australia to ban extremist far-right organisations as other nations take decisive action on the issue.

Labor's home affairs spokeswoman Kristina Keneally this week called on the Morrison government to send a signal that extremist views won’t be tolerated by officially listing and banning right-wing organisations.

The United Kingdom, the United States and Canada have all moved to ban extremist right-wing groups in their jurisdictions.

Deakin University counter-terrorism expert Professor Greg Barton said Western democracies around the world are increasingly being forced to consider taking stronger action against the extreme far-right.

“There certainly is increasing pressure from Western democracies to ban right-wing extremist groups both in the political realm and the social media realm,” he told SBS News.

“(But) this is the very challenging area, we don’t have such clear egregious examples that we can easily move – often I think in practice this will apply to individuals not organisations.”

Currently, there are no such groups on Australia’s banned terrorist organisation list.

There are currently 26 groups on the Australian list - 25 of those are Islamist organisations and the other is the Kurdistan Worker's Party.

ASIO has warned that right-wing extremism poses an increasing threat in Australia as groups become more organised.

Counter-terrorism expert Leanne Close from the Australian Strategic Policy Institute told SBS News there were at least a dozen right-wing groups emerging in Australia.

She said they can be defined by a nationalistic and anti-Islamic approach, a focus on cultural superiority and behaviour that trends towards violence. 

“I know ASIO will always be keeping an eye on whether these groups are moving to a call to action,” she said.

“(But) the situation in Australia at the moment is... not as dire as places like the US and the experience that possibly the UK is having in relation to right-wing extremism.”

Earlier this week, the British home secretary Priti Patel moved to outlaw the far-right terror group Feuerkrieg Division, which has advocated the use of violence and mass murder as part of an apocalyptic race war.

In February, the United Kingdom also formally banned extremist right group the Sonnenkrieg Division and recognised the System Resistance Network as an alias of National Action – another right-wing group on the list.

In April this year the United States designated the Russian Imperial Movement, a white supremacist group, as global terrorists.

Canada has itself listed right wing extremist groups Blood and Honour and Combat 18 as terrorist groups.

Senator Keneally said the time has come for Australia to take stronger action against those that posed a right-wing extremist threat.

"The proscription of a right-wing organisation - international or domestic - would send a powerful message that these extremist views will not be tolerated," she wrote in an article for ASPI's The Strategist.

The coronavirus pandemic has also fuelled the spread of extremist messages.

Counter-terrorism analyst Professor Clive Williams has warned against specific bans on targeted groups.

“I don’t think it is a good idea to ban right wing groups because once you ban them it drives them underground and makes them much more cautious about their communication,” he told SBS News.

“The threat really from right-wing groups can be monitored fairly well because at the moment they are not particularly security conscious and they are relatively easy to infiltrate.”

Under Australia's national security laws, before an organisation is listed, the home affairs minister must be satisfied on reasonable grounds that it "is directly or indirectly engaged in preparing, planning, assisting or fostering the doing of a terrorist act, or advocates the doing of a terrorist act".

Mr Barton said the splintered nature of right-wing extremist groups means authorities in Australia remained more likely to target the behaviour of individuals rather than implement targeted bans.

“Most of this is not going to be about banning a group … it’s going to be working out the individual behavioural level and communications,” he said.

“There does seem to be an awareness we are going to have to do something.”

SOURCE 

1 comments


Eugenics: An embarrassed silence and a hidden history

Eugenics was one of the great enthusiasms of the prewar Left. In the 1920s and 1930s, Socialists everywhere embraced it, including National Socialists in Germany.

The abiding mark of socialists is that they want to change the world -- change it into what they think will be a better place. And eugenics fitted that perfectly. If they removed the "weeds" from the human race, that would make a greatly improved world. And Leftist legislatures worldwide passed eugenic laws of varying severity: Sweden, Germany and the United States being prominent examples.

The man who took it furthest was, however, Adolf Hitler. He killed "useless eaters" in droves. So the military defeat of Hitler led to him and all his works being discredited: suddenly there were no longer any differences between races and eugenics was immoral. Hitler had been such a great menace and ended up such an abject failure that any similarities with him had to be denied.

Leftists everywhere dropped eugenics like a sizzling potato. They no longer advocated it. More significant, however, was that they succeeded in casting a cloak of silence over it. They succeeded in blanking out all memory of their association with eugenics. Hardly anyone now knows what a great enthusiasm eugenics once was for the Left. Were it well known, their great enthusiasms of the present -- such as global warming and transgenderism -- might also be viewed skeptically.


Eugenics and scientific racism in the United States emerged in the second half of the nineteenth century and lasted through the 1930s. It claimed that heredity was the fundamental determinant of an individual’s ability to contribute to society. Eugenics claimed the scientific ability to classify individuals and groups as “fit” or “unfit.” The unfit were defined by race, mental and physical disabilities, country of origin, and poverty. Eugenics was widely accepted by academics, politicians, intellectuals, government, the U.S. Supreme Court, and especially progressives, who supported eugenics-inspired policies as policy instruments to be utilized by an interventionist administrative state to establish a healthy and productive society. Those who questioned the “settled science” of eugenics were dismissed as “deniers,” much like those who question the “settled science” of climate change are today dismissed as “deniers.”

Eugenics and slavery share much common ground in their inherent racist view of blacks; however, the inherent racist perspective of eugenics was broader in that the set of those considered unfit included individuals and groups beyond those who were black. Eugenics provided the scientific foundation for involuntary sterilization policies in thirty-two states, supported the racist immigration policies in the first part of the twentieth century, and supported a variety of de jure and de facto policies designed to limit those defined as “unfit” to less than full-citizenship status. More troubling, eugenics and eugenics-inspired policies in the United States were admired by Adolf Hitler. American and German eugenicists interacted and exchanged views up to the late 1930s, and sterilization laws, immigration restrictions based on race or ethnicity, and efforts to prevent full citizenship to the unfit in the United States became the model for the Nuremberg Laws of 1935. Stefan Kühl (1994) was the first to document in detail the American–German eugenics connection. In Hitler’s American Model (2017), James Whitman extended this research to illustrate how U.S. policies influenced Nazi race law in the 1930s and the Nuremberg Laws in particular. The Big Lie: Exposing the Nazi Roots of the American Left (2017) by Dinesh D’Souza is the most recent effort to bring public attention to eugenics and the American–German connection.

The widespread acceptance of eugenics in the United States, especially by progressives, is a troubling part of U.S. history unknown to many Americans, and the role model America provided for Nazi race law is even more troubling. The conventional wisdom in the United States places blame for scientific racism on Germany, but the opposite is an inconvenient truth that continues to receive little public attention. The fall of the Third Reich revealed the logical outcome of eugenics. Eugenics disappeared almost overnight from public discourse and became an embarrassment to many who had supported it and its policy implications.

I have covered eugenics and related topics in my lectures on the history of economic ideas for many years and have been surprised at two reactions from students: first, many students find eugenics and related topics the most interesting part of the course, and, second, with only a few exceptions the students have never heard of eugenics in the United States and, especially, its relationship to Nazi Germany. This lack of awareness suggests a question and the catalyst for this paper: To what degree are high school students exposed to the history of eugenics?

One would expect that with the current political focus on discrimination and racism, eugenics would be an important topic in U.S. history and related courses at the high school level. Unfortunately, this is not the case. As I show in this paper, high school history textbooks essentially ignore the topic. Although our high school textbooks are impressive in presentation, length, and number of topics covered, eugenics and its influence on public policy in the United States and its relationship to Nazi Germany are ignored and when mentioned are presented as an incidental part of U.S. history.

I first discuss how eugenics emerged from a combination of the political economy of population growth initiated by Thomas Malthus (1798) and subsequent developments in human biology in the second half of the nineteenth century. Next I discuss how the United States became the center of eugenic research and policy, the relationship between eugenics and the progressive movement, and the degree to which eugenics in the United States influenced Germany and the Nuremberg Laws of 1935. Then I look in particular at nine high school textbooks and other textbook materials to determine the degree to which eugenics is covered in high school. In the concluding section, I offer conjectures to account for the omission and the missed opportunities to educate students resulting from the omission.

SOURCE



An Aboriginal death in custody

There appears to be some mystification about this death.  There should not be.  Chatfield was an Aborigine and Cutmore probably was too. And the stimulus for the seizures leading to Chatfield's death was clearly his forced separation from his cellmate.

Why should separation from his cellmate be distressing?  Because Aborigines are hugely social.  They need to be with one another. An Aborigine put into solitary confinement will do his best to commit suicide.  That need for social connectedness is not unknown among whites.  I suffer from it to some extent also.  It can be very distressing.

That it is super-strong among Aborigines is demonstrated in the way Aborigines can be "Sung" to death by their tribe.  A tribal Aborigine who breaches an important tribal behaviour code will be "Sung" to death.  The singing/chanting is simply an emphasis of the fact that the offender has been excommunicated from the tribe.  It makes the excommunication final.  So the offender has no-one to whom he has a social connection.  Whatever the physical process may be, death is rapid.

One does see something similar among whites. A mother who suicides after the death of her child, for instance, will sometimes be referred to as having died of a "broken heart".

Chatfield and Cutmore would have been housed together because prison authorities know that housing Aborigines together reduces problems with them.  But in so doing they caused the usual Aborigine grabbing for affiliation to take place.  The two became "Mates" in a very strong sense.

So separating them led to the perfectly normal result among Aborigines:  Deep distress leading to death.


An inquest into the death of an Indigenous man in NSW custody has heard the young father may have had multiple seizures and was distressed to be separated from his cellmate on his last night in remand.

Tane Chatfield died in September 2017 after being held on remand at Tamworth Correctional Centre for two years. The 22-year-old attended court in Armidale but was returned to Tamworth after the first day of a hearing on 19 September.

Darren Brian Cutmore had been Chatfield’s cellmate in the preceding days, but was moved to a different cell that night as the pair were co-accused.

Cutmore told deputy state coroner Harriet Grahame that Chatfield was on the way back to Tamworth from court “happy as can be” as he was confident of being acquitted.

But Chatfield’s former cellmate, who considered himself an “older brother” to the 22-year-old, could still remember his reaction when he realised the pair were to be separated later that night.

“He was very upset ... he said ‘all we’ve got is each other and now they’ve fucking taken that away from us too’,” Cutmore told the inquest on Monday.

Cutmore also said that while the pair had often used drugs in prison together, he did not think Chatfield did so on the night of 19 September.

The man who replaced Cutmore in Chatfield’s cell, Barry Evans, told the inquest the deceased appeared “agitated” following his separation from Cutmore but he made his new cellmate feel “welcome and comfortable”.

Evans, who only met Chatfield that day, said he did not see his cellmate use drugs or hear him talk about doing so.

The former firefighter said he called for help after seeing Chatfield hit the floor. “It was like he was having a fit,” Evans told the inquest.

One of the officers at Tamworth Correctional Centre that day, David Mezanaric, told the inquest he knew of the victim having two seizures on 19 September – one in his cell and one in a treatment room before paramedics arrived.

The victim’s mother, Nioka Chatfield, said the grief she felt after the death of her son “became like a chronic illness” and her family needed accountability to move forward.

“I can’t tell you how my boy lost life ... there are lots of unanswered questions,” Chatfield said after the first day of the inquest.

“I’m only concentrating on the love that will never change for my boy. The boy who I saw smiling down at me when I was tying his laces ... the teenager I saw playing football, and the young 22-year-old who lost his life in custody.”

NSW Corrective Services at the time said Chatfield’s death was not suspicious, telling his family he took his own life.

Chatfield died after two days at Tamworth Base Hospital on 22 September 2017.

SOURCE 




"Kimberley is our land and we want the right to work it"

So says an Aboriginal leader.

Aborigines have NO legal right to the many tracts of land they claim as theirs.  Various governments have GIVEN them title to some tracts in the hope that their claims on land will be satisfied by that.  But that is a joke of course.  "Give them an inch and they will take a mile" applies.  Nothing will ever be enough.

So the sob story below is yet another grab for land.  They want to take over a productive station.

A line has to be drawn somewhere and it could surely be drawn in accordance with the best use of the land.  Farms and stations given to Aborigines in the past have been shockingly misused, going back to the Lake Tyers disaster.  Basically, what Aborigines do is eat all the cattle and let the buildings go to rack and ruin.  A productive tract of land becomes a wilderness.

Greenies no doubt think that is a good thing but what might the heavily taxed average citizen think of all that waste?


Last year, when Kimberley traditional owners bought Myroodah Station off the Indigenous Land Corporation, I was elated and deeply moved. Many generations of our families had worked on Myroodah for white bosses — some were paid, some were slaves.

Quanbun and Jubilee stations are located on my country, on the mighty Martuwarra, the Fitzroy River, the lifeblood of our country and connected to Yi-martuwarra Ngurrara people; it was made when the world was still soft in the Dreamtime.

When the Aboriginal-owned Kimberley Agriculture and Pastoral Company purchased Myroodah, I thought that we had reached a turning point where Kimberley traditional owners were shaping our own destiny, closing the gap through creating our own economic development opportunities, and stepping up to manage and set the strategic direction for the pastoral stations our families once laboured on.

Last week has seen a terrible knockback for our people, with the purchase of Jubilee Downs cattle station, which contains the Quanbun Station lease, by mining and pastoral magnate Andrew “Twiggy” Forrest. Traditional owners — represented by KAPCO, Yanunijarra Aboriginal Corporation and the Nature Conservancy — put in a $25m bid to buy one of these stations. It wasn’t enough. It wasn’t enough to even get a foot in the door so we could negotiate further, bump up our bid. For us, this isn’t just an acquisition, just about money, just another asset to add to our portfolio. This is our country. We are trying to buy back our own country.

According to the ABC, Forrest spoke of “job creation for local communities”. Job creation isn’t sufficient. We do not wish to work for white bosses, like our mothers and grandmothers and great grandmothers did. We wish to work for ourselves, under our own leadership, on our own traditional lands.

He spoke too, of continuing the legacy of the previous owners; that is, regenerative land management and a herd of quality cattle. What he didn’t mention, was the darker legacy — a legacy of trauma and dispossession still felt by Yi-martuwarra Ngurrara people, whose lands these stations occupy, today. Take Quanbun, for example. In the 1905 Royal Commission on the Condition of the Natives, commonly known as the Roth Report, evidence was heard that the white boss of Quanbun had an Aboriginal woman he kept, and the overseer had from eight to 10 Aboriginal women to choose from. History tells us that in many cases, on many stations, these women were stolen from their husbands and raped. In the case of the evidence gathered in the Roth Report, the women on Quanbun were whipped at night if they allowed the sheep to stray.

This is the legacy of white pastoralists we remember. And while it is, of course, true that Forrest has been generous to indigenous Australians and cannot be held accountable for the sins of past white men, whether his family or otherwise, it’s also true that Forrest’s family has a long and storied history in the Kimberley. His great, great uncle was Alexander Forrest, an explorer and politician, credited with opening up the Kimberley region for pastoral activity. Alexander had significant pastoral interests in the Kimberley, including ownership of Yeeda Station, where my great, great grandmother worked. In 1893, Alexander Forrest asked whether “the life of one European is not worth a thousand natives, as far as settlement of this country is concerned”. Within the context of the Black Lives Matter movement, within the context of the sale of this station to one “European”, one white owner, instead of to a collective worth “a thousand natives”, we’re asking ourselves, in almost 150 years, how much has really changed? Will the ill-starred history of Kimberley traditional owners continue repeating on us in terms of the ownership of our land?

Most critical is the position of these two stations on the Martuwarra, the Fitzroy River. The river is the lifeblood of Yi-martuwarra Ngurrara and Nykina country. Forrest has said that when it comes to plans for the properties nothing is off the table. This worries me, as large-scale irrigation projects have been floated by Gina Rinehart, and would threaten cultural sites, as well as barramundi, gummy sharks, sawfish and stingrays — a whole ecosystem. Our lifeblood. In the wake of this news, I’m especially disappointed for Kimberley traditional owners.

This is really not about Andrew Forrest. This is about justice for our people and getting our land back. He should relinquish the bid, right the wrongs of the past, and allow traditional owners to buy back our own lands.

SOURCE 



How has America got policing so wrong? 25 US police chiefs toured Scotland. What they saw left them visibly changed

This article promises more than it delivers.  The lesson learnt from Scotland seems to be more emphasis on de-escalation techniques. But such techniques are already a big part of police training.  Still, seeing examples of de-escalation working was probably beneficial.

What the article glides over is that America is an armed society whereas Scotland is largely a disarmed society. So the risk of an officer being shot is very different -- leading to much more caution in the USA.  An American cop often cannot afford to give a villain a break.


It is a late Saturday afternoon in Washington, DC, and Chuck Wexler, one of America’s leading police reform strategists, is at his office desk, a cotton scarf bunched around his neck ready for quick deployment as a face mask, a sign of these strange times.

The streets of America’s major cities have been awash with Black Lives Matter protesters for a fortnight. Incredibly, news has just broken that another unarmed black man has been shot in Atlanta and the city’s police chief will soon be forced to resign.

A former right-hand man to Boston’s police commissioner, Wexler has led the country’s foremost crime strategy think tank, the Police Executive Research Forum, for 26 years. He is one of the architects of the city’s Community Disorders Unit, known nationally for successfully prosecuting and preventing racially motivated crime, and is the man America’s top police chiefs call for advice when facing complex or volatile situations in their jurisdictions.

Wexler says that his experience working with police in England, Scotland and Ireland, where police are not routinely armed, was instrumental in his thinking — and continues to help him push for change in the attitudes of the US police hierarchy.

“It was something of an epiphany for me, around the time that the Ferguson (police shooting in St Louis) incident happened in 2014 … I was at a police recruit graduation ceremony in Scotland and I asked a young constable how he’d handle someone with a knife and not having a gun. There was a knife epidemic in Scotland at the time. He said, ‘No problem … I have my baton, my spray … first, I would step back.’ I thought how is it that when police handle it one way, someone dies, and in another place where they handle it differently they live.

“We knew from our studies that 40 per cent of the fatal officer-involved shootings in the US involved persons with knives, rocks, bricks — not guns. I went back to DC excited, full of ideas of how we could train our officers differently, emulate the UK models and see if we could reduce the 400 or so deaths that The Washington Post had identified could be prevented each year. Perhaps I was being naive, but nobody paid any attention. Then I had another idea: I thought I’ll show them first-hand.”

Wexler ended up inviting 25 of the top police chiefs in the US to come to Scotland with him to try to show them how officers in other countries were doing things differently. He insisted they pay their own way and made clear “there were no hotels or fancy food and they’d sleep in police barracks”. He says he watched attitudes visibly change during the trip, and that exposing the leadership group to each other led them to see that they were already doing significant work individually in their jurisdictions but simply didn’t know it.

For example, he says, SWAT teams from Houston, Texas, already were working on “slow down” protocols, using time and distance to de-escalate. In New York, $21m had been deployed to retrain officers: “If I told them about Scotland, they were dismissive, but when we could show them other big US forces were responding and changing, they thought: ‘Well, we can learn from them.’ ”

Also accompanying Wexler to Scotland was American philanthropist Howard Buffett, son of respected investor Warren Buffett. Howard Buffett immediately took an interest in police de-escalation practices and, with the support of his foundation, PERF turned its guiding principles into a police training program called ICAT: Integrating Communications, Assessment and Tactics.

“Chuck Wexler pushes the limits so others can see the benefits of change,” Buffett told Inquirer.

Wexler has since led several major projects including a new strategy to encourage police to deal with the opiate epidemic in the US as a health rather than law enforcement issue.

PERF has campaigned to encourage all US police forces to adopt body cameras and has written new guidelines for police handling of sexual assault allegations along with a slew of recommendations and strategies aimed at fostering police-community trust. Last year, PERF developed a new protocol to help train officers to identify and defuse the toxic epidemic of “suicide by cop” situations in which people, often mentally ill and affected by drugs or alcohol, create violent stand-offs that lead to their death at the hands of the law.

Recent calls to ‘‘de-fund’’ the police, he says, are a reaction to the anger many citizens feel about the use of excessive force. PERF supports and has long advocated efforts aimed at reorganising resources, perhaps creating different networks of first-responder teams to triage emergencies. In some cases, this has meant turning to mental health and drug and alcohol crisis teams first rather than police. But, Wexler says, police reform needs investment to accomplish real change.

In Camden, New Jersey, where 40 per cent of residents fall below the poverty line, officials with the help of PERF and its leadership disbanded its police department and replaced it with one under county control, guided by progressive policing techniques and leadership. This has resulted in a reduction in violent crime, and police were photographed marching alongside protesters in the wake of George Floyd’s death.

Wexler says one of his team’s most difficult jobs is the constant review of body cam and citizen phone footage of violent incidents involving police. But the team uses the videos as teaching lessons in its ICAT program.

“The most awful part is to see someone’s home, to see their child there on the second floor, who has not taken their medication and is standing there with a knife and the officer is trained to issue orders and then, if necessary, use deadly force,” he says.

“Once again, I’m reminded of while I was in Scotland, when a constable asked me why a US officer had said that the most important duty he had was to ‘get his officers home safe at night’. She said to me: ‘Why do they say that? We would never say that. For us it is about getting everyone home safe at night. It’s a human right.’ ”

Floyd’s death, says Wexler, is a watershed moment and he wants to ensure that the momentum for change in US policing is not lost in the wake of the chaos and suffering created by the COVID-19 crisis.

SOURCE  



Inside medicine’s culture of racism, bullying and harassment

I have no doubt that the instances described below did happen.  What I doubt is that they are common.  The medical profession encounters many of the hard edges of human society and so is less idealistic than most.  As a result, doctors can be cynical and reserved in their approach to others.

I see something of that when I meet a medical practitioner who is new to me.  When they hear that I am a retired university lecturer, their attitude to me visibly warms.  I become one of them rather than someone who has to be approached with caution. And I do generally get on well with doctors.

So I can see that doctors have been hardened by experience and that might make them unsympathetic or abrupt on occasions.  But does that do much harm?  One would think that Asian students might be treated unkindly and I believe that they are on occasions.  But the large numbers of Asian doctors I encounter one way or another tell me that they are pretty good at surviving any such travails.  The large number of female doctors tells a similar story.

And the assumption that receivers of donor sperm usually prefer Caucasian donors is not ignorant. It is simply accurate.  Caucasian types are overwhelmingly preferred by recipients.  England gets a high proportion of its donated sperm from Denmark, where blue eyes and blond hair are common.  The Viking invasion is not over!

So the claim that medicine has a culture of racism, bullying and harassment surely has something to it, but not much.


Being told indirectly that, unless you’re a white man, no one is going to want your sperm is not something you forget.

But medical students say racist slurs, social exclusion, gender discrimination and inappropriate jibes from their superiors are a common experience and it highlights the need for urgent changes in the industry.

Sam, a fifth-year medical student who is a person of colour, says bullying is “endemic” in medicine, especially if you are not white.

He has been subject to a number of slurs, including one incident a few weeks ago involving a midwife in the IVF ward of a Sydney hospital.

The student was in the room when a group of nurses were discussing a female patient who had requested an Asian sperm donor. “(The midwife) said, ‘I don’t understand why you wouldn’t want to use caucasian sperm’,” Sam explained.

And Sam’s not alone. Many of his peers have also endured deeply unpleasant experiences.

Another fifth-year student, Tim*, said he benefited from being a white man in the medical industry and wanted to do more to help his international colleagues.

“It’s difficult to report because a lot of this stuff toes the line. It’s not like someone has slapped you across the face; it’s usually much less obvious,” Tim said.

One example he gave involved a teacher who was very particular about students arriving to class on time, and wouldn’t let them in if they were late.

“One day I arrived a few minutes late and he said, ‘Don’t worry, come in and sit down.’ But a student from an Indian background arrived straight after me and he wouldn’t let him in,” Tim explained.

“Then I noticed it was a repetitive thing. He’d let the caucasian students in but not the international students. It’s just not good enough.”

From belittling, to sexist comments and favouring male colleagues, sexism in medicine has also been allowed to flourish.

One female medical student told NCA NewsWire she was placed in a male-dominated team that made jokes about women being in surgery.

“They would say, ‘Why are you here? You need a family-friendly career,’” the student said.

“I couldn’t report it because I was the only female student in there and it would have been obvious that it was me.”

A second female student said while her experiences had been good, everyone assumed she was a nurse, not a doctor.

“Most of my teachers always refer to doctors being a ‘he’ and nurses being a ‘she’,” the student explained.

Sam supported those comments, saying when he entered a theatre no one asked any questions, but when female students did, they were queried.

All four students described being ignored or hounded in front of patients or fellow staff.

When Tim spent time as part of a neurosurgery team, he should have done ward rounds and accompanied seniors into surgery. Instead, he was ignored.

“When they found out I was a student and not doctor, they wouldn’t even acknowledge me or say hello. This continued the entire time,” he said.

“For the majority of that term, it wasn’t what they were saying; it was them not saying anything.”

And when they did speak, it was often to belittle the Sydney student.

He said things escalated when he noticed a patient wasn’t responding to questions and failed to open her eyes or move her hands.

“I thought, ‘this could be life-threatening’ so I said to the doctor, ‘Shouldn’t we do something? She doesn’t look good.’ But in front of everyone, they would be really dismissive and start asking things like, ‘What do you think is wrong with her? What should you do?’” he said.

“That patient was quite ill and no one was doing something about it.”

While not all doctors gave students a rough time, many have experienced verbal abuse, social exclusion, racial discrimination, gender stereotyping and general rudeness, usually from surgeons and physicians.

A report, published by BMC Medical Education and driven by fifth-year UNSW Medicine student Laura Colenbrander, found that in the past year alone Bankstown-Lidcombe, St George, Royal Prince Alfred, Westmead and Tamworth hospitals had all made headlines regarding mistreatment of junior doctors.

The hierarchical structure of medicine fuelled the “endemic culture” of bullying and harassment, often perpetrated by senior staff, Ms Colenbrander’s study found.

All four students said the hierarchy created barriers to reporting mistreatment, as they feared they would be labelled a troublemaker.

Students were also concerned it would affect career progression or that reporting avenues did not guarantee confidentiality or an outcome.

“Senior doctors were overwhelmingly considered unapproachable because they were ‘self-important’, sexist, uninterested, too busy, or participants feared verbal abuse,” the report states.

Australian Medical Students Association president Daniel Zou said the reporting processes for bullying and harassment remained unclear to many medical students.

“There should be confidential, easily accessible, clearly communicated and consistent reporting pathways available for all medical students,” he told NCA NewsWire.

“In many hospitals and medical schools, there are no guaranteed confidential reporting processes or anonymous reporting processes. For those hospitals and medical schools that do, they are oftentimes confusing pathways, inaccessible and ineffectual.”

Tim argued the industry had a responsibility to teach students about what bullying and harassment was.

“There are a lot of things we didn’t realise were serious,” he said. “And a lot of medical students won’t report it because we know nothing will happen. It’s not a big enough issue to bring up with top-level hospital management.”

Of the four study participants in Ms Colenbrander’s research who had reported an incident or knew someone who had, none had experienced desired outcomes.

This included sexist behaviour from surgeons on which the clinical school had insufficient authority to act.

This harassment extends beyond students. In 2015, the Australian Medical Association (AMA) confirmed more than 50 per cent of doctors and trainees (not including medical students) had been bullied or harassed, with verbal harassment among consultants most commonly cited.

Ms Colenbrander said the issue of bullying and harassment “spoke to her” because she knew many students who had experienced this in a hospital setting. “It just seemed widespread,” Ms Colenbrander told NCA NewsWire.

“Personally my experiences have been really positive. I’ve had great teachers and experiences but I’ve also definitely experienced the underbelly of medicine.”

According to a survey released by the Medical Board of Australia, one in three trainee doctors in Australia have experienced or witnessed bullying, harassment or discrimination in the past 12 months.

However, only a third have done anything about it, with 57 per cent believing they would suffer negative consequences if they reported the inappropriate behaviour.

And mistreatment of medical students will no doubt have long-term consequences for the nation’s future doctors.

“It has an epidemic bullying culture. Medicine isn’t immune from the stuff that happens in other professions. It’s still very rife and still there,” Sam said. “These are the people that look after you, so why can’t they look after their own?”

SOURCE