Topics - Muhammed Rashedul Hasan

1
Bengal famine: Remembering WW2's forgotten disaster

In 1943, during World War Two, Bengal in British-run India was hit with a severe shortage of food.

Following the Japanese occupation of Burma, the Allied forces had halted the movement of food in the region.

No one knows how many people died, but estimates range between three million and five million.

Professor Rafiqul Islam was a child in Bengal at the time. He spoke to Witness about the little-known famine that claimed so many lives.

To hear from the witness click on: http://www.bbc.co.uk/programmes/p004t1hd

2
Kenya mourns victims of Garissa al-Shabab attack

Kenya has begun three days of mourning for the 148 victims of an attack on students by militant group al-Shabab.

Easter ceremonies will be held to remember those who died in Thursday's attack on Garissa University, and flags are expected to fly at half-mast.

President Uhuru Kenyatta has vowed to respond to the attack "in the severest ways possible".

Sunni Islam's most respected seat of learning, Cairo's al-Azhar University, has also condemned the attack.

The Kenyan Red Cross says that so far 54 of the victims have been identified by relatives at a morgue in the capital, Nairobi.

Buses are transporting more than 600 students and about 50 staff who survived the attacks to their hometowns.

Many survivors have been reunited with their families at Nairobi's Nyayo National Stadium which has been set up as a disaster centre.

Eighteen-year-old Lavenda Mutesi, who jumped out of her dorm room window to escape the attack, told AP: "As much as I'm grateful, I wish my friends were here, because I wish they could share this moment with me, with their parents... I lost a whole lot of friends."

Almost all of the 148 killed were students and another 79 people were injured.

Four gunmen were killed, and officials say they are holding five people for questioning - one of whom is believed to be a university security guard.

United in grief

Both Christians and Muslims have denounced the attack. On Sunday, Sunni Islam's most respected seat of learning, Cairo's al-Azhar University, said it condemned the "terrorist attack".

Pope Francis is expected to use his traditional Easter Sunday message to describe the students as contemporary Christian martyrs.

In Kenya, people took to the streets to protest against the killings and to reject the idea that al-Shabab had succeeded in dividing the country.

"What I can say is that here in Eastleigh [a Somali and Muslim Nairobi suburb] both Christians and Muslims are doing business together. There is harmony... There is no religion that says people should kill one another," one man told the BBC.

'Defend our way of life'

On Saturday, President Kenyatta said that al-Shabab posed an "existential threat" to Kenya.

He vowed to "fight terrorism to the end" and said the militants would not succeed in their aim of creating an Islamic caliphate in Kenya.

The president's address came as the relatives of victims queued at a morgue in the capital Nairobi to identify their loved ones.

The bodies were flown to Nairobi for identification, as local mortuaries have been unable to cope, and many of the students killed came from other parts of the country.

The bodies of the four gunmen who died remain in Garissa, where they were put on public display on Saturday.

Earlier on Saturday, a 19-year-old girl was found unhurt in a cupboard on the university campus, where she had hidden for two days.

There has been criticism in Garissa, which is 150km (100 miles) from the Somali border, of how the security services dealt with the attack.

Only two guards were on duty at the time of the assault, despite official warnings that an attack on an institution of higher learning was likely.

Al-Shabab, which is based in neighbouring Somalia, has pledged a "long, gruesome war" against Kenya.

The group said its attacks were in retaliation for acts by Kenya's security forces, who are part of the African Union's mission in Somalia against al-Shabab.

SOURCE: http://www.bbc.com/news/world-africa-32187891

3
Lee Kuan Yew: Lessons for leaders from Asia's 'Grand Master'
Graham Allison, Special to CNN

The death of the founding father of Singapore last Monday is an appropriate occasion to reflect on nation building.

As prime minister for its first three decades, Lee Kuan Yew raised a poor port from the bottom rungs of the third world to the first world in a single generation.

As it prepares to mark its 50th anniversary as a nation, Singapore is today an ultra-modern metropolis of almost six million people with higher per capita GDP than the United States, according to the World Bank.

Lee's achievement in building a successful nation contrasts sharply with the results of Washington's expenditure of over $4 trillion and nearly 7,000 American lives in Iraq and Afghanistan over the past decade.

Some say Singapore's story is sui generis: Something that could only happen in that time and place.

But its remarkable performance has less to do with miraculous conditions than with Lee's model of disciplined, visionary leadership.

Leaders of other aspiring-to-develop nations, and even the U.S., should take pages from Lee Kuan Yew's playbook to address current challenges.

'Grand Master's' lessons
We know many of Lee's lessons on the role of government leadership in development because my co-authors and I asked him directly two years ago to reflect on them -- points we captured in our book, Lee Kuan Yew: The Grand Master's Insights on China, the United States, and the World.

Five stand out.

First, Lee insisted that governance was first and foremost about results.

In his words, "the acid test of any legal system is not the greatness or the grandeur of its ideal concepts, but whether, in fact, it is able to produce order and justice."

About the core purposes of government, he was crystal clear. In terms America's founding fathers would recognize, he believed that "the ultimate test of the value of a political system is whether it helps that society establish conditions which improve the standard of living for the majority of its people, plus enabling the maximum of personal freedoms compatible with the freedoms of others in society."

Moral leadership
Second, superior performance requires superior leadership.

Lee demanded of leaders both intellectual and moral superiority. Contrary to modern Western democratic theory that emphasizes citizens' participation in governance, his views were closer to Plato's conception of the "guardians," or China's historical Mandarins.

Good government requires most of all leaders who put the public good unquestionably above their own personal interests.

He was disappointed by many of his counterparts who failed that test.

Equal opportunity
Third, successful societies guarantee strict equality of opportunity for all individuals, but are realistic about the fact that this will yield substantial inequalities in outcomes.

For Lee, the essence of a successful society was intense competition on a level playing field that allows each individual to achieve his or her maximum.

Few things offended him more than denial of equality of opportunity on the basis of caste (India), class (Europe), race (the U.S. during segregation), sex, or other irrelevant attributes.

As he put it, the leader's objective was to "build up a society in which people will be rewarded not according to the amount of property they own, but according to their active contribution to society in physical or mental labor."

Discipline, not democracy
Fourth, about democracy, particularly Western liberal democracy, Lee had serious reservations.

In part, this attitude stemmed from his own experience, but it also reflected a deeper philosophical aversion to ideologies.

As he liked to say, "the acid test is performance, not promises.

"The millions dispossessed in Asia care not and know not of theory. They want a better life. They want a more equal, just society."

Lee enjoyed engaging American critics who insisted that without democracy Singapore could not develop an advanced economy.

In contrast, he argued that what most countries needed was more "discipline," rather than democracy.

He noted that the U.S. had been building democracy and giving aid to the Philippines for over a century.

But, he asked, how many people from Singapore sought to leave it for the Philippines?

Many people in the Philippines, he noted, wanted to move to Singapore.

On one occasion, with a broad smile, he continued, "and you will notice that since the Vietnam War and the Great Society, the U.S. system has not functioned even for the United States."

Stability and strength
Fifth, which leaders did he most admire? From the recent past, he focused on three: Charles de Gaulle, Deng Xiaoping, and Winston Churchill.

"De Gaulle, because he had tremendous guts; Deng, because he changed China from a broken-backed state, which would have imploded like the Soviet Union, into what it is today; and Churchill, because any other person would have given up."

On the current scene, the leader who impressed him most was the new president of China, Xi Jinping.

As he said just before Xi took office: "I would put him in Nelson Mandela's class of persons. A person with enormous emotional stability who does not allow his personal misfortunes or sufferings to affect his judgment. In a word, he is impressive."

As China's leaders attempt to follow in Lee's footsteps in building a Mandarin-Leninist led nation that overtook the U.S. last year in GDP (measured by PPP) to become the world's largest economy, and democratic India seems poised to grow at rates that will compete with China, we can reflect on lessons from Lee Kuan Yew and place our bets.

Governing a nation in which two of every three citizens believe their country is headed in the wrong direction -- and have believed so under Democratic and Republican Presidents for all of the 21st century -- American leaders should ask whether it is time to focus on the acid test of performance rather than the litmus test of ideology.

SOURCE: http://edition.cnn.com/2015/03/28/opinions/singapore-lee-kuan-yew-graham-allison/index.html


4
Lee Kuan Yew: How did Asia remember him?
By Jonathan Head
BBC South East Asia correspondent

"Anyone who thinks he is a statesman ought to see a psychiatrist."

That comment, from Lee Kuan Yew, the most quotable of Asian leaders, must have been made with his tongue at least partly in his cheek. His exceptionally long tenure on the diplomatic stage, his brilliant intellect and ruthless pragmatism earned Lee the accolade of "statesman" from more world leaders than any other personality in the Asia Pacific region.

But what about closer to home, in South East Asia? There, Lee Kuan Yew's image is more complex.

"Some countries are born independent. Some achieve independence. Singapore had independence thrust upon it," he wrote in 1998. When Singapore was expelled from Malaysia in 1965, Lee Kuan Yew was pessimistic about its prospects.

He was acutely conscious of its vulnerability, a small, largely Chinese island-state sandwiched between two much larger Muslim countries, Malaysia and Indonesia, both of which were hostile. He was also worried about the consequences of American reverses in Vietnam.

American ally
A self-professed Machiavellian, Lee believed raw power determined the fate of nations, and Singapore had little.

He wanted the might of the US to anchor his country, but expected its Soviet rival to challenge this in Asia. Early on, he anticipated the rise of China, which, he believed, would inevitably view South East Asia as its own backyard.

So the formation of Asean (the Association of South East Asian Nations) in 1967, at the prompting of the then-Thai and Indonesian foreign ministers, was initially regarded with some scepticism by the Singaporean leader, although he also saw that it was essential his country play a leading role, in the hope of advancing Singapore's acceptance as an equal player in the region.

Lingering disputes over territorial waters and many other issues continued to dog relations with Indonesia and Malaysia for several years. There were also differences between Singapore and its neighbours over a Malaysian proposal to establish a zone of neutrality in Asean, which would have required an end to all foreign bases. Lee wished to preserve his country's close military ties with the US.

Pragmatic interlocutor
But with his first visit to Indonesia under President Suharto in 1973, Lee Kuan Yew showed one of his greatest abilities: building strong working relationships with other South East Asian leaders. He was quick to understand the enigmatic Indonesian general and to establish a pragmatic, trusting rapport with him, which lasted until Suharto's death in 2008.

He was later able to form a similar, though less warm and trusting, relationship with long-standing Malaysian Prime Minister Mahathir Mohamad, whom he recognised as a man who, for all of his animosity towards the Chinese of Singapore, shared Lee's ambition to move his country forward.

They also both rejected Western criticism of their approach to human rights in the 1990s, leading to the vaguely articulated notion of "Asian values", which prioritised stability and economic progress over individual freedoms.

His other great contribution to Asean was as interlocutor with the rest of the world, in particular the US and China.

Regional spokesperson
His clear-headed strategic views influenced a succession of US presidents and officials, in particular the Cold War Secretaries of State Henry Kissinger and George Shultz. He made a ground-breaking visit to China in 1976, when the rest of Asean was still deeply suspicious of Beijing's role in sponsoring insurgencies, and received Deng Xiaoping in Singapore two years later, as he led his country out of its international isolation in the late 1970s.

The new Chinese leader was fascinated by Singapore's blend of authoritarian rule and entrepreneurial success, and Lee used his ties with China to persuade it to take a more conciliatory approach towards South East Asia. More than anyone else, he was able to articulate Asean's concerns to the great powers - in his view, "to ensure its interests were taken into account".

As Jusuf Wanandi, one of the architects of Suharto's pragmatic New Order, wrote following the Singaporean's death: "Lee, with his sharp thinking, especially on the future of East Asia and Asia Pacific, had become the spokesperson for the region, in particular to the West, and that was indeed an important role for him to play. And regarding the future strategic development of the region, no one can replace him."

Free-trade advocate
It was however the US defeat in Vietnam in 1975, and then the Vietnamese invasion of Cambodia in late 1978, that compelled Asean to elevate itself to more than just a talking-shop for easing internal disputes and building trust, which is largely what it was until 1975. Lee Kuan Yew now played a central role in constructing a more robust Asean architecture.

First, because of his conviction that political stability came out of economic progress, he took every opportunity to promote trade within Asean, raising the idea of a free trade area as early as 1973.

This suited Singapore, which had pinned its survival on having one of the world's freest trade and investment regimes, but it proved far more difficult to overcome the suspicion and vested interests of neighbouring countries. The Asean Free Trade Area finally came into effect only in 1994.

Second, with his conviction that Asean could never be militarily strong or cohesive enough to provide for its own security unaided, he pushed for a framework that would keep the superpowers engaged and in dialogue with the region. This eventually gave rise in the 1990s to the Asean Regional Forum, a unique, annual gathering that brings together the foreign ministers of China, the United States, Russia, Japan, the European Union, North and South Korea, among others, to hammer out international security issues.

"You cannot replace the reality of power by just talk," he said in 1993. "You may diminish suspicions and fears - and that is a very great achievement."

Outsize role
Blunt-spoken and stubborn in his convictions, Lee was not always able to overcome differences with his Asean partners. Indonesia and Malaysia were uneasy about his rapprochement with China, although he did give a private assurance that Singapore would not normalise relations with Beijing until Indonesia did, and honoured that promise.

Likewise, he worried about those two countries' approaches to Vietnam during the stand-off over Cambodia in the 1980s, fearing that the Soviet-backed Vietnamese would undermine Asean unity. He harboured no long-term enmity towards Communist Vietnam, he said, but stuck to the principle that invading other countries was unacceptable. The Vietnamese seemed to accept that explanation, honouring Lee with the role of an official economic advisor in 1992 after the Cambodia issue was settled.

In later years some of Lee's outspoken comments about his neighbours continued to cause friction. Relations with Indonesian Presidents BJ Habibie and Abdurrahman Wahid were frosty after he cast doubt on their leadership abilities. His continued criticism of the pro-Malay Bumiputera policy in Malaysia sparked a war of words with Dr Mahathir. He viewed Thailand as a capricious and unreliable partner.

As he often said, he cared little whether he was liked or not.

But the eulogies to Lee Kuan Yew from neighbouring countries are not insincere. He played an outsize role in building the stability and prosperity of this region, and will long be remembered for that.

SOURCE: http://www.bbc.com/news/world-asia-32046144

5
Lee Kuan Yew: Singapore holds funeral procession

Singapore is bidding farewell to its founding Prime Minister Lee Kuan Yew, who died on Monday aged 91.

Despite torrential rain, thousands lined the streets to view the funeral procession carrying Mr Lee's coffin from parliament, where it has been lying in state, across the city.

A state funeral attended by world leaders is now taking place, ahead of a private family cremation ceremony.

One million people have visited tribute sites this week, say local media.

More than half a million people - 12% of Singaporean citizens - visited Parliament House to see Mr Lee's coffin, while at least 850,000 others went to community sites to pay tribute.

In his eulogy, Mr Lee's son and current Prime Minister, Lee Hsien Loong, said his father had "lived and breathed Singapore all his life".

"The light that has guided us all these years has been extinguished," he said.

The funeral procession began on Sunday at 12:30 (04:30 GMT) as Mr Lee's body was taken from Parliament House on a gun carriage.

A 21-gun salute sounded, echoing across the city, as the procession moved on into the business district and Tanjong Pagar, the docklands constituency Mr Lee represented for his whole political life.

Military jets flew overhead while two Singaporean navy vessels conducted a sail-past of the Marina Bay barrage - the massive water conservation project spearheaded by Mr Lee.

The country will observe a minute's silence in the afternoon before singing the national anthem. The private cremation is taking place at the Mandai crematorium.

SOURCE: http://www.bbc.com/news/world-asia-32102686

6
Remembering Howard Zinn
Noam Chomsky
Resist Newsletter, March/April 2010

It is not easy for me to write a few words about Howard Zinn, the great American activist and historian who passed away a few days ago. He was a very close friend for 45 years. The families were very close too. His wife Roz, who died of cancer not long before, was also a marvelous person and close friend. Also somber is the realization that a whole generation seems to be disappearing, including several other old friends: Edward Said, Eqbal Ahmad, and others, who were not only astute and productive scholars but also dedicated and courageous militants, always on call when needed -- which was constant. A combination that is essential if there is to be hope of decent survival.
Howard's remarkable life and work are summarized best in his own words. His primary concern, he explained, was "the countless small actions of unknown people" that lie at the roots of "those great moments" that enter the historical record -- a record that will be profoundly misleading, and seriously disempowering, if it is torn from these roots as it passes through the filters of doctrine and dogma. His life was always closely intertwined with his writings and innumerable talks and interviews. It was devoted, selflessly, to empowerment of the unknown people who brought about great moments. That was true when he was an industrial worker and labor activist, and from the days, 50 years ago, when he was teaching at Spelman College in Atlanta, Georgia, a black college that was open mostly to the small black elite.

While teaching at Spelman, Howard supported the students who were at the cutting edge of the civil rights movement in its early and most dangerous days, many of whom became quite well-known in later years -- Alice Walker, Julian Bond, and others -- and who loved and revered him, as did everyone who knew him well. And as always, he did not just support them, which was rare enough, but also participated directly with them in their most hazardous efforts -- no easy undertaking at that time, before there was any organized popular movement and in the face of government hostility that lasted for some years. Finally, popular support was ignited, in large part by the courageous actions of the young people who were sitting in at lunch counters, riding freedom buses, organizing demonstrations, facing bitter racism and brutality, sometimes death.

By the early 1960s a mass popular movement was taking shape, by then with Martin Luther King in a leadership role, and the government had to respond. As a reward for his courage and honesty, Howard was soon expelled from the college where he taught. A few years later he wrote the standard work on SNCC (the Student Nonviolent Coordinating Committee), the major organization of those "unknown people" whose "countless small actions" played such an important part in creating the groundswell that enabled King to gain significant influence, as I am sure he would have been the first to say, and to bring the country to honor the constitutional amendments of a century earlier that had theoretically granted elementary civil rights to former slaves -- at least to do so partially; no need to stress that there remains a long way to go.

On a personal note, I came to know Howard well when we went together to a civil rights demonstration in Jackson, Mississippi in (I think) 1964, even at that late date a scene of violent public antagonism, police brutality, and indifference or even cooperation with state security forces on the part of federal authorities, sometimes in ways that were quite shocking.

After being expelled from the Atlanta college where he taught, Howard came to Boston, and spent the rest of his academic career at Boston University, where he was, I am sure, the most admired and loved faculty member on campus, and the target of bitter antagonism and petty cruelty on the part of the administration -- though in later years, after his retirement, he gained the public honor and respect that was always overwhelming among students, staff, much of the faculty, and the general community. While there, Howard wrote the books that brought him well-deserved fame. His book Vietnam: The Logic of Withdrawal, in 1967, was the first to express clearly and powerfully what many were then beginning barely to contemplate: that the US had no right even to call for a negotiated settlement in Vietnam, leaving Washington with power and substantial control in the country it had invaded and by then already largely destroyed. Rather, the US should do what any aggressor should: withdraw, allow the population to somehow reconstruct as they could from the wreckage, and if minimal honesty could be attained, pay massive reparations for the crimes that the invading armies had committed, vast crimes in this case. The book had wide influence among the public, although to this day its message can barely even be comprehended in elite educated circles, an indication of how much necessary work lies ahead.

Significantly, among the general public by the war's end, 70% regarded the war as "fundamentally wrong and immoral," not "a mistake," a remarkable figure considering the fact that scarcely a hint of such a thought was expressible in mainstream opinion. Howard's writings -- and, as always, his prominent presence in protest and direct resistance -- were a major factor in civilizing much of the country.

In those same years, Howard also became one of the most prominent supporters of the resistance movement that was then developing. He was one of the early signers of the Call to Resist Illegitimate Authority and was so close to the activities of Resist that he was practically one of the organizers. He also took part at once in the sanctuary actions that had a remarkable impact in galvanizing antiwar protest. Whatever was needed -- talks, participation in civil disobedience, support for resisters, testimony at trials -- Howard was always there.

Even more influential in the long run than Howard's anti-war writings and actions was his enduring masterpiece, A People's History of the United States, a book that literally changed the consciousness of a generation. Here he developed with care, lucidity, and comprehensive sweep his fundamental message about the crucial role of the people who remain unknown in carrying forward the endless struggle for peace and justice, and about the victims of the systems of power that create their own versions of history and seek to impose it. Later, his "Voices" from the People's History, now an acclaimed theatrical and television production, has brought to many the actual words of those forgotten or ignored people who have played such a valuable role in creating a better world.

Howard's unique success in drawing the actions and voices of unknown people from the depths to which they had largely been consigned has spawned extensive historical research following a similar path, focusing on critical periods of American history, and turning to the record in other countries as well, a very welcome development. It is not entirely novel -- there had been scholarly inquiries of particular topics before -- but nothing to compare with Howard's broad and incisive evocation of "history from below," compensating for critical omissions in how American history had been interpreted and conveyed.

Howard's dedicated activism continued, literally without a break, until the very end, even in his last years, when he was suffering from severe infirmity and personal loss, though one would hardly know it when meeting him or watching him speaking tirelessly to captivated audiences all over the country. Whenever there was a struggle for peace and justice, Howard was there, on the front lines, unflagging in his enthusiasm, and inspiring in his integrity, engagement, eloquence and insight, light touch of humor in the face of adversity, dedication to non-violence, and sheer decency. It is hard even to imagine how many young people's lives were touched, and how deeply, by his achievements, both in his work and his life.

There are places where Howard's life and work should have particular resonance. One, which should be much better known, is Turkey. I know of no other country where leading writers, artists, journalists, academics and other intellectuals have compiled such an impressive record of bravery and integrity in condemning crimes of state, and going beyond to engage in civil disobedience to try to bring oppression and violence to an end, facing and sometimes enduring severe repression, and then returning to the task. It is an honorable record, unique to my knowledge, a record of which the country should be proud. And one that should be a model for others, just as Howard Zinn's life and work are an unforgettable model, sure to leave a permanent stamp on how history is understood and how a decent and honorable life should be lived.

SOURCE: http://www.chomsky.info/articles/201002--.htm

7
Cancer: The mysterious miracle cases inspiring doctors
David Robson

A few patients have made rare and unexpected recoveries leaving doctors scratching their heads, says David Robson. Can these cases provide vital clues for tackling cancer?

It was a case that baffled everyone involved. The 74-year-old woman had initially been troubled by a rash that wouldn’t go away. By the time she arrived at the hospital, her lower right leg was covered in waxy lumps, eruptions of angry red and livid purple. Tests confirmed the worst suspicions: it was carcinoma, a form of skin cancer.

The future looked bleak. Given the spread of the tumours, radiotherapy would not have been effective; nor could the doctors dig the tumours from the skin. Amputation was perhaps the best option, says Alan Irvine, the patient’s doctor at St James’ Hospital, Dublin – but at her age, she was unlikely to adapt well to a prosthetic limb. After a long and frank discussion, they decided to wait as they weighed up the options. “We had a lot of agonising about what to do,” says Irvine.

Then the “miracle” started. Despite receiving no treatment at all, the tumours were shrinking and shrivelling before their eyes. “We watched for a period of a few months and the tumours just disappeared,” says Irvine. After 20 weeks, the patient was cancer-free. “There had been no doubt about her diagnosis,” he says. “But now there was nothing in the biopsies, or the scans.”

Somehow, she had healed herself of arguably our most feared disease. “Everyone was thrilled, and a bit puzzled,” Irvine says, with some understatement. “It shows that it is possible for the body to clear cancer – even if it is incredibly rare.”

The question is, how? Irvine’s patient believed it was the hand of God; she had kissed a religious relic just before the healing set in. But scientists are instead looking to the underlying biology of so-called “spontaneous regression” to hunt for clues that could make these rare cases of self-healing more common. “If you can train the body to do this on a broader scale, you could have something that’s very widely applicable,” says Irvine.

In theory, our immune system should hunt out and destroy mutated cells before they ever develop into cancer. Occasionally, however, these cells manage to sneak under the radar, reproducing until they grow into a full-blown tumour.

By the time the cancer has reached the attention of doctors, unaided recovery is highly unlikely: overall, just one in 100,000 cancer patients is thought to shed the disease without treatment.

Disappearing act

Within those scant reports, though, there are some truly incredible stories. A hospital in the UK, for instance, recently reported the case of a woman who had experienced long-lasting fertility problems. She then discovered that she had a tumour between her rectum and her uterus, but before doctors could operate, she finally conceived. All went well and a healthy baby was delivered – only for the doctors to find that the cancer had mysteriously vanished during the pregnancy. Nine years later, she shows no sign of relapse.

Similarly spectacular recoveries have now been recorded in many different kinds of cancer, including extremely aggressive forms like acute myeloid leukaemia, which involves the abnormal growth of white blood cells. “If you leave the patient untreated, they usually die within weeks, if not days,” says Armin Rashidi at Washington University in St Louis. Yet he has found 46 cases in which acute myeloid leukaemia regressed of its own accord, although only eight avoided a relapse in the long term. “If you find a random oncologist and ask if this can happen, 99% would say no – it makes no sense,” says Rashidi, who worked with colleague Stephen Fisher on the paper.

Agonising wait

In contrast, dramatic recoveries from a childhood cancer called neuroblastoma are surprisingly frequent – offering some of the best clues about what might trigger spontaneous remission. This cancer arises from tumours in the nervous system and hormonal glands. If it then spreads, or metastasises, it can lead to nodules on the skin and growths in the liver, with swelling in the abdomen that makes it difficult for the infant to breathe.

Neuroblastoma is very distressing, yet it can sometimes disappear as quickly as it came, even without medical intervention. In fact, for infants less than one year old, regression is so common that doctors tend to avoid starting chemotherapy immediately, in the hope that the tumour will shrink by itself. “I can remember three cases with rather impressive skin metastases and an enlarged liver, but we literally just observed them – and they did well,” says Garrett Brodeur at the Children’s Hospital of Philadelphia.

The decision to sit and observe can be difficult, though: although the chance to avoid harrowing treatment comes as a relief to some parents, others find inaction and helplessness difficult to stomach. The agony of that period is one of the reasons that Brodeur wants to understand the mechanisms behind the cancer’s vanishing act. “We want to develop very specific agents that might initiate regression – so we don’t need to wait for nature to run its course or for ‘god’ to decide,” he says.

Vital clues

So far, Brodeur has some strong leads. For instance, unlike other nerve cells, the cells in neuroblastoma tumours seem to have developed the ability to survive without “nerve growth factor” (NGF) – allowing them to flourish in the wrong parts of the body where NGF is absent. Spontaneous remission may be triggered by a natural change in the neuroblastoma tumour cells, perhaps involving the cell receptors that NGF binds to. Whatever the change is, it might mean that the cells can no longer survive without the essential nutrient.

If so, a drug that targets those receptors could kick-start recovery in other patients. Brodeur says that two drug companies already have some candidates, and he hopes trials will begin soon. “It would selectively kill tumour cells that are sensitive to this pathway, so it could spare patients from chemotherapy, radiotherapy or surgery,” he says. “It wouldn’t make them sick or their hair fall out, or cause their blood cell count to fall.”

Friendly fire

Unfortunately, unexpected recoveries from other kinds of cancer have been less well studied, perhaps because of their rarity. But there are some clues, and they could come from the pioneering work of a little-known American doctor more than 100 years ago.

It was the late 19th Century, and William Bradley Coley was struggling to save a patient with a large tumour in his neck. Five operations had failed to eradicate the cancer. Then the patient caught a nasty skin infection with a scorching fever. By the time he’d recovered, the tumour was gone. Testing the principle on a small number of other patients, Coley found that deliberately infecting them with bacteria, or treating them with toxins harvested from microbes, destroyed otherwise inoperable tumours.

Could infection be the key to stimulating spontaneous remission more generally? Analyses of the recent evidence certainly make a compelling case for exploring the idea. Rashidi and Fisher’s study found that 90% of the patients recovering from leukaemia had suffered another illness such as pneumonia shortly before the cancer disappeared. Other papers have noted tumours vanishing after diphtheria, gonorrhoea, hepatitis, influenza, malaria, measles, smallpox and syphilis. What doesn’t kill you really can make you stronger in these strange circumstances.

It’s not the microbes, per se, that bring about the healing; rather, the infection is thought to trigger an immune response that is inhospitable to the tumour. The heat of the fever, for instance, may itself render the tumour cells more vulnerable, and trigger cell suicide. Or perhaps it’s significant that when we are fighting bacteria or viruses, our blood is awash with inflammatory molecules that are a call to arms for the body’s macrophages, turning these immune cells into warriors that kill and engulf microbes – and potentially the cancer too. “I think the infection changes the innate immune cells from helping the tumours to killing them,” says Henrik Schmidt at Aarhus University Hospital in Denmark. That, in turn, may also stimulate other parts of the immune system – such as our dendritic cells and T-cells – to learn to recognise the tumorous cells, so that they can attack the cancer again should it return.

Schmidt thinks that understanding the process of spontaneous remission is vital, since it could help refine the emerging class of “immunotherapies” that hijack our natural defences to combat cancer. In one treatment, for instance, doctors inject some cancer patients with inflammatory “cytokines” in order to kick the immune system into action. The side effects – such as high fever and flu-like symptoms – are typically treated with drugs like paracetamol, to improve the patient’s comfort.

But given that the fever itself may trigger remission, Schmidt suspected that the paracetamol might sap the treatment’s potency. Sure enough, he has found that more than twice as many patients – 25% versus 10% – survive past the two-year follow-up, if they were instead left to weather the fever.

There could be many other simple but powerful steps to improve cancer treatment inspired by these insights. One man experienced spontaneous remission after a tetanus and diphtheria vaccination, for instance – perhaps because vaccines also act as a call to arms for the immune system. Along these lines, Rashidi points out that a receiving standard vaccine booster – such as the BCG jab against tuberculosis – seems to reduce the chance of melanoma relapse after chemotherapy.

Catching a cure

Others are considering a far more radical line of attack. For instance, one approach aims to deliberately infect cancer patients with a tropical disease.

The technique, developed by American start-up PrimeVax, involves a two-pronged approach. It would begin by taking a sample of the tumour, and collecting dendritic cells from the patient’s blood. These cells help coordinate the immune system’s response to a threat, and by exposing them to the tumour in the lab, it is possible to programme them to recognise the cancerous cells. Meanwhile, the patient is given a dose of dengue fever, a disease normally carried by mosquitoes, before they are injected with the newly trained dendritic cells.

Under the supervision of doctors in a hospital, the patient would begin to develop a 40.5C fever, combined with the widespread release of inflammatory molecules – putting the rest of the immune system on red alert. Where the tumour was once able to lurk under the radar, it should now become a prime target for an intense attack from the immune cells, led by the programmed dendritic cells. “Dengue fever crashes and regroups the immune system, so that it is reset to kill tumour cells,” says Bruce Lyday at PrimeVax.

It was a case that baffled everyone involved. The 74-year-old woman had initially been troubled by a rash that wouldn’t go away. By the time she arrived at the hospital, her lower right leg was covered in waxy lumps, eruptions of angry red and livid purple. Tests confirmed the worst suspicions: it was carcinoma, a form of skin cancer.

The future looked bleak. Given the spread of the tumours, radiotherapy would not have been effective; nor could the doctors dig the tumours from the skin. Amputation was perhaps the best option, says Alan Irvine, the patient’s doctor at St James’ Hospital, Dublin – but at her age, she was unlikely to adapt well to a prosthetic limb. After a long and frank discussion, they decided to wait as they weighed up the options. “We had a lot of agonising for what to do,” says Irvine.

Then the “miracle” started. Despite receiving no treatment at all, the tumours were shrinking and shrivelling before their eyes. “We watched for a period of a few months and the tumours just disappeared,” says Irvine. After 20 weeks, the patient was cancer-free. “There had been no doubt about her diagnosis,” he says. “But now there was nothing in the biopsies, or the scans.”

Somehow, she had healed herself of arguably our most feared disease. “Everyone was thrilled, and a bit puzzled,” Irvine says, with some understatement. “It shows that it is possible for the body to clear cancer – even if it is incredibly rare.”

The question is, how? Irvine’s patient believed it was the hand of God; she had kissed a religious relic just before the healing set in. But scientists are instead looking to the underlying biology of so-called “spontaneous regression” to hunt for clues that could make these rare cases of self-healing more common. “If you can train the body to do this on a broader scale, you could have something that’s very widely applicable,” says Irvine.

Knowing how to trigger an immune response may help beat cancer (SPL)
In theory, our immune system should hunt out and destroy mutated cells before they ever develop into cancer. Occasionally, however, these cells manage to sneak under the radar, reproducing until they grow into a full-blown tumour.

By the time the cancer has reached the attention of doctors, unaided recovery is highly unlikely: overall, just one in 100,000 cancer patients are thought to shed the disease without treatment.

Disappearing act

Within those scant reports, though, there are some truly incredible stories. A hospital in the UK, for instance, recently reported the case of a woman who had experienced long-lasting fertility problems. She then discovered that she had a tumour between her rectum and her uterus, but before doctors could operate, she finally conceived. All went well and a healthy baby was delivered – only for the doctors to find that the cancer had mysteriously vanished during the pregnancy. Nine years later, she shows no sign of relapse.

What was it about the body of one pregnant woman that beat cancer? (Thinkstock)
Similarly spectacular recoveries have now been recorded in many different kinds of cancer, including extremely aggressive forms like acute myeloid leukaemia, which involves the abnormal growth of white blood cells. “If you leave the patient untreated, they usually die within weeks, if not days,” says Armin Rashidi at Washington University in St Louis. Yet he has found 46 cases in which acute myeloid leukaemia regressed of its own accord, although only eight avoided a relapse in the long term. “If you find a random oncologist and ask if this can happen, 99% would say no – it makes no sense,” says Rashidi, who worked with colleague Stephen Fisher on the paper.

Agonising wait

In contrast, dramatic recoveries from a childhood cancer called neuroblastoma are surprisingly frequent – offering some of the best clues about what might trigger spontaneous remission. This cancer arises from tumours in the nervous system and hormonal glands. If it then spreads, or metastasises, it can lead to nodules on the skin and growths in the liver, with swelling in the abdomen that makes it difficult for the infant to breathe.

Neuroblastoma is very distressing, yet it can sometimes disappear as quickly as it came, even without medical intervention. In fact, for infants less than one year old, regression is so common that doctors tend to avoid starting chemotherapy immediately, in the hope that the tumour will shrink by itself. “I can remember three cases with rather impressive skin metastases and an enlarged liver, but we literally just observed them – and they did well,” says Garrett Brodeur at the Children’s Hospital of Philadelphia.

The decision to sit and observe can be difficult, though: although the chance to avoid harrowing treatment comes as a relief to some parents, others find inaction and helplessness difficult to stomach. The agony of that period is one of the reasons that Brodeur wants to understand the mechanisms behind the cancer’s vanishing act. “We want to develop very specific agents that might initiate regression – so we don’t need to wait for nature to run its course or for ‘god’ to decide,” he says.

Vital clues

So far, Brodeur has some strong leads. For instance, unlike other nerve cells, the cells in neuroblastoma tumours seem to have developed the ability to survive without “nerve growth factor” (NGF) – allowing them to flourish in the wrong parts of the body where NGF is absent. Spontaneous remission may be triggered by a natural change in the neuroblastoma tumour cells, perhaps involving the cell receptors that NGF binds to. Whatever the change is, it might mean that the cells can no longer survive without the essential nutrient.

If so, a drug that targets those receptors could kick-start recovery in other patients. Brodeur says that two drug companies already have some candidates, and he hopes trials will begin soon. “It would selectively kill tumour cells that are sensitive to this pathway, so it could spare patients from chemotherapy, radiotherapy or surgery,” he says. “It wouldn’t make them sick or their hair fall out, or cause their blood cell count to fall.”

Friendly fire

Unfortunately, unexpected recoveries from other kinds of cancer have been less well studied, perhaps because of their rarity. But there are some clues, and they could come from the pioneering work of a little-known American doctor more than 100 years ago.

It was the late 19th Century, and William Bradley Coley was struggling to save a patient with a large tumour in his neck. Five operations had failed to eradicate the cancer. Then the patient caught a nasty skin infection with a scorching fever. By the time he’d recovered, the tumour was gone. Testing the principle on a small number of other patients, Coley found that deliberately infecting them with bacteria, or treating them with toxins harvested from microbes, destroyed otherwise inoperable tumours.

Could infection be the key to stimulating spontaneous remission more generally? Analyses of the recent evidence certainly make a compelling case for exploring the idea. Rashidi and Fisher’s study found that 90% of the patients recovering from leukaemia had suffered another illness such as pneumonia shortly before the cancer disappeared. Other papers have noted tumours vanishing after diphtheria, gonorrhoea, hepatitis, influenza, malaria, measles, smallpox and syphilis. What doesn’t kill you really can make you stronger in these strange circumstances.

It’s not the microbes, per se, that bring about the healing; rather, the infection is thought to trigger an immune response that is inhospitable to the tumour. The heat of the fever, for instance, may itself render the tumour cells more vulnerable, and trigger cell suicide. Or perhaps it’s significant that when we are fighting bacteria or viruses, our blood is awash with inflammatory molecules that are a call to arms for the body’s macrophages, turning these immune cells into warriors that kill and engulf microbes – and potentially the cancer too. “I think the infection changes the innate immune cells from helping the tumours to killing them,” says Henrik Schmidt at Aarhus University Hospital in Denmark. That, in turn, may also stimulate other parts of the immune system – such as our dendritic cells and T-cells – to learn to recognise the tumorous cells, so that they can attack the cancer again should it return.

Schmidt thinks that understanding the process of spontaneous remission is vital, since it could help refine the emerging class of “immunotherapies” that hijack our natural defences to combat cancer. In one treatment, for instance, doctors inject some cancer patients with inflammatory “cytokines” in order to kick the immune system into action. The side effects – such as high fever and flu-like symptoms – are typically treated with drugs like paracetamol, to improve the patient’s comfort.

But given that the fever itself may trigger remission, Schmidt suspected that the paracetamol might sap the treatment’s potency. Sure enough, he has found that more than twice as many patients – 25% versus 10% – survive past the two-year follow-up, if they were instead left to weather the fever.

There could be many other simple but powerful steps to improve cancer treatment inspired by these insights. One man experienced spontaneous remission after a tetanus and diphtheria vaccination, for instance – perhaps because vaccines also act as a call to arms for the immune system. Along these lines, Rashidi points out that receiving a standard vaccine booster – such as the BCG jab against tuberculosis – seems to reduce the chance of melanoma relapse after chemotherapy.

Catching a cure

Others are considering a far more radical line of attack. For instance, one approach aims to deliberately infect cancer patients with a tropical disease.

Should we infect cancer patients with tropical diseases? (SPL)
The technique, developed by American start-up PrimeVax, involves a two-pronged approach. It would begin by taking a sample of the tumour, and collecting dendritic cells from the patient’s blood. These cells help coordinate the immune system’s response to a threat, and by exposing them to the tumour in the lab, it is possible to programme them to recognise the cancerous cells. Meanwhile, the patient is given a dose of dengue fever, a disease normally carried by mosquitoes, before they are injected with the newly trained dendritic cells.

Under the supervision of doctors in a hospital, the patient would begin to develop a 40.5C fever, combined with the widespread release of inflammatory molecules – putting the rest of the immune system on red alert. Where the tumour was once able to lurk under the radar, it should now become a prime target for an intense attack from the immune cells, led by the programmed dendritic cells. “Dengue fever crashes and regroups the immune system, so that it is reset to kill tumour cells,” says Bruce Lyday at PrimeVax.

Infecting vulnerable patients with a tropical illness may sound foolhardy, but dengue fever is less likely to kill the average adult than the common cold – making it the safest choice of infection. Importantly, once the fever has subsided, the programmed immune cells will remain on the lookout for the tumour, should it reappear. “Cancer is a moving target. Most therapies attack from just one side – but we’re trying to put it in a lose-lose situation, now and in the future,” says Lyday.

No one could fault the ambition behind this kind of therapy. “Our mission is to replicate spontaneous remission in as standardised a way as possible,” says Lyday’s colleague Tony Chen. Even so, they are keen to emphasise that their idea is still at a very early stage of development – and they cannot know how it will play out until they begin a clinical trial. The first tests, they hope, will begin with advanced melanoma patients, perhaps by the end of the year.

Clearly, caution is necessary. As Irvine points out: “Spontaneous remission is a little clue in a big complicated jigsaw.” But if – and that is a massive if – they succeed, the implications would be staggering. A rapid, relatively painless recovery from cancer is now considered a miracle. The dream is that it might just become the norm.

SOURCE: http://www.bbc.com/future/story/20150306-the-mystery-of-vanishing-cancer

8
How a doctor’s words can make you ill
David Robson

A good bedside manner can help heal the body, but if doctors don’t choose their words carefully, they can also make you unwell. Have you ever visited a doctor, and come away feeling they weren’t much help? Listen to the following audio clip, and you might start to understand why.

During a role-play for BBC World Service’s Discovery programme, presenter Geoff Watts talks to Dr Mark Porter about problems with his knees. Throughout the interview, Porter’s words subtly create a negative impression for the patient. He says he has some “bad news” and the knees are “worn out” due to osteoarthritis; the drugs “help a bit” – but they may damage the lining of the stomach, he says.

As Watts goes on to discover, those subtle cues might actually exacerbate the physical symptoms. “The problem with the way I sold it, was that it validated your concerns that your knee’s falling apart, it’s crumbling, you’re doomed,” says Porter. “And the side effects I mentioned – I put them out of all proportion.”

Experiments have shown that simply warning people about certain side-effects can actually make them more likely to experience the nausea, fatigue, headaches or diarrhoea – even when they have been assigned innocuous pills rather than an active drug.

Healing words

Medicine has long known about the placebo effect – the healing power of good expectations. But the nocebo effect, as its evil twin is known, may be more powerful. “It’s easier to do harm than good,” explains Watts. “And this is worrisome, because nocebo’s negative influence can be found lurking in almost every aspect of medical life and beyond.”

In extreme circumstances it could even be deadly, as we recently explored in our article “The contagious thought that could kill you”. 

The good news is that, through the same power of the mind-body connection, a good bedside manner may do wonders for treatment. One study found that depressed patients given placebo pills by an empathetic doctor ended up with better results than those taking an active drug from a psychiatrist who seemed less concerned about their welfare. Some scientists have even hypothesised that doctors could try to make use of the placebo effect to reduce the dose given to patients – by using the power of their mind to make up the difference. “Healing is a real phenomenon. We all have the ability to self-heal in many conditions and that can be activated by our interactions with other people,” says Paul Dieppe at Exeter Medical School.

Simple measures might include taking an empathetic and caring attitude during diagnosis, one that considers the patient’s concerns and fears, however unlikely, says Porter. And when prescribing treatments, the doctor should emphasise the positive effects of the medicine, while framing the negative side-effects so they seem less frightening, and being careful not to over-emphasise their risks.

“Every word counts, every glance counts,” says Ted Kaptchuk of Harvard University. And it’s an opportunity that shouldn’t be missed. “I don’t think that’s going to be a burden for physicians or nurses. I think it’s going to be a way of making them feel a part of the treatment – that’s an awareness that’s just beginning in healthcare.”

SOURCE: http://www.bbc.com/future/story/20150309-the-simple-words-that-make-us-ill

9
Back-up brains: The era of digital immortality
 Simon Parkin

How do you want to be remembered? As Simon Parkin discovers, we may eventually be able to preserve our entire minds for generations to come – would you?

A few months before she died, my grandmother made a decision. Bobby, as her friends called her (theirs is a generation of nicknames), was a farmer’s wife who not only survived World War II but also found in it justification for her natural hoarding talent. ‘Waste not, want not’ was a principle she lived by long after England recovered from a war that left it buckled and wasted. So she kept old envelopes and bits of cardboard cereal boxes for note-taking and lists. She kept frayed blankets and musty blouses from the 1950s in case she needed material to mend. By extension, she was also a meticulous chronicler. She kept albums of photographs of her family members. She kept, in a box, the airmail love letters my late grandfather sent her while he travelled the world with the merchant navy. Her home was filled with the debris of her memories.

Yet in the months leading up to her death, the emphasis shifted from hoarding to sharing. Every time I visited my car would fill with stuff: unopened cartons of orange juice, balls of fraying wool, damp, antique books, empty glass jars. All things she needed to rehome now she faced her mortality. The memories too began to move out. She sent faded photographs to her children, grandchildren and friends, as well as letters containing vivid paragraphs detailing some experience or other.

On 9 April, the afternoon before the night she died, she posted a letter to one of her late husband’s old childhood friends. In the envelope she enclosed some photographs of my grandfather and his friend playing as young children. “You must have them,” she wrote to him. It was a demand but also a plea, perhaps, that these things not be lost or forgotten when, a few hours later, she slipped away in her favourite armchair.

The hope that we will be remembered after we are gone is both elemental and universal. The poet Carl Sandburg captured this common feeling in his 1916 poem Troths:


Yellow dust on a bumblebee’s wing,
Grey lights in a woman’s asking eyes,
Red ruins in the changing sunset embers:
I take you and pile high the memories.
Death will break her claws on some I keep.

It is a wishful tribute to the potency of memories. The idea that a memory could prove so enduring that it might grant its holder immortality is a romantic notion that could only be held by a young poet, unbothered by the aches and scars of age.

Nevertheless, while Sandburg’s memories failed to save him, they survived him. Humans have, since the first paintings scratched on cave walls, sought to confound the final vanishing of memory. Oral history, diary, memoir, photography, film and poetry: all tools in humanity’s arsenal in the war against time’s whitewash. Today we bank our memories onto the internet’s enigmatic servers, those humming vaults tucked away in the cooling climate of the far North or South. There’s the Facebook timeline that records our most significant life events, the Instagram account on which we store our likeness, the Gmail inbox that documents our conversations, and the YouTube channel that broadcasts how we move, talk or sing. We collect and curate our memories more thoroughly than ever before, in every case grasping for a certain kind of immortality.

Is it enough? We save what we believe to be important, but what if we miss something crucial? What if some essential context to our words or photographs is lost? How much better it would be to save everything, not only the written thoughts and snapped moments of life, but the entire mind: everything we know and all that we remember, the love affairs and heartbreaks, the moments of victory and of shame, the lies we told and the truths we learned. If you could save your mind like a computer’s hard drive, would you? It’s a question some hope to pose to us soon. They are the engineers working on the technology that will be able to create wholesale copies of our minds and memories that live on after we are burned or buried. If they succeed, it promises to have profound, and perhaps unsettling, consequences for the way we live, who we love and how we die.

Carbon copy

I keep my grandmother’s letters to me in a folder by my desk. She wrote often and generously. I also have a photograph of her in my kitchen on the wall, and a stack of those antique books, now dried out, still unread. These are the ways in which I remember her and her memories, saved in hard copy. But could I have done more to save her?

San Franciscan Aaron Sunshine’s grandmother also passed away recently. “One thing that struck me is how little of her is left,” the 30-year-old tells me. “It’s just a few possessions. I have an old shirt of hers that I wear around the house. There's her property but that's just faceless money. It has no more personality than any other dollar bill.” Her death inspired Sunshine to sign up with Eterni.me, a web service that seeks to ensure that a person’s memories are preserved after their death online.

It works like this: while you’re alive you grant the service access to your Facebook, Twitter and email accounts, upload photos, geo-location history and even Google Glass recordings of things that you have seen. The data is collected, filtered and analysed before it’s transferred to an AI avatar that tries to emulate your looks and personality. The avatar learns more about you as you interact with it while you’re alive, with the aim of more closely reflecting you as time progresses.
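The pipeline described here (collect a person's posts and messages, analyse them, then let an avatar answer questions from them) can be loosely illustrated with a toy retrieval model. Everything below is a hypothetical sketch: the `LegacyAvatar` class, the similarity scoring and the sample data are inventions for illustration, not Eterni.me's actual system.

```python
from collections import Counter
import math

class LegacyAvatar:
    """Toy sketch of a 'digital legacy' avatar: it stores snippets of a
    person's writing and answers a question by retrieving the stored
    snippet most similar to it (bag-of-words cosine similarity)."""

    def __init__(self):
        self.memories = []  # list of (original text, word-count vector)

    @staticmethod
    def _vectorise(text):
        return Counter(text.lower().split())

    def ingest(self, text):
        # One post, email or photo caption entering the avatar's memory.
        self.memories.append((text, self._vectorise(text)))

    def respond(self, question):
        # Return the memory whose word overlap with the question is largest.
        q = self._vectorise(question)
        def cosine(v):
            dot = sum(q[w] * v[w] for w in q)
            norm = (math.sqrt(sum(n * n for n in q.values()))
                    * math.sqrt(sum(n * n for n in v.values())))
            return dot / norm if norm else 0.0
        text, _ = max(self.memories, key=lambda m: cosine(m[1]))
        return text

avatar = LegacyAvatar()
avatar.ingest("I always made jam from the garden plums in September.")
avatar.ingest("Your grandfather wrote to me from every port he visited.")
print(avatar.respond("What did my grandfather write to you?"))
# → Your grandfather wrote to me from every port he visited.
```

A real service of this kind would use far richer language models, but the core loop is the same: the more material the avatar ingests while you are alive, the more plausibly it can answer in your voice later.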

“It’s about creating an interactive legacy, a way to avoid being totally forgotten in the future,” says Marius Ursache, one of Eterni.me’s co-creators. “Your grand-grand-children will use it instead of a search engine or timeline to access information about you – from photos of family events to your thoughts on certain topics to songs you wrote but never published.” For Sunshine, the idea that he might be able to interact with a legacy avatar of his grandmother that reflected her personality and values is comforting. “I dreamt about her last night,” he says. “Right now a dream is the only way I can talk to her. But what if there was a simulation? She would somehow be less gone from my life.”

While Ursache has grand ambitions for the Eterni.me service (“it could be a virtual library of humanity”), the technology is still in its infancy. He estimates that subscribers will need to interact with their avatars for decades for the simulation to become as accurate as possible. He’s already received many messages from terminally ill patients who want to know when the service will be available – whether they can record themselves in this way before they die. “It’s difficult to reply to them, because the technology may take years to build to a level that’s useable and offers real value,” he says. But Sunshine is optimistic. “I have no doubt that someone will be able to create good simulations of people's personalities with the ability to converse satisfactorily,” he says. “It could change our relationship with death, providing some noise where there is only silence. It could create truer memories of a person in the place of the vague stories we have today.”

It could, I suppose. But what if the company one day goes under? As the servers are switched off, the people they house would die a second death.

As my own grandmother grew older, some of her memories retained their vivid quality; each detail remained resolute and in place. Others became confused: the specifics shifted somehow in each retelling. Eterni.me and other similar services counter the fallibility of human memory; they offer a way to fix the details of a life as time passes. But any simulation is a mere approximation of a person and, as anyone who has owned a Facebook profile knows, the act of recording one’s life on social media is a selective process. Details can be tweaked, emphases can be altered, entire relationships can be erased if it suits one’s current circumstances. We often give, in other words, an unreliable account of ourselves.

Total recall

What if, rather than simply picking and choosing what we want to capture in digital form, it was possible to record the contents of a mind in their entirety? This work is neither science fiction nor the niche pursuit of unreasonably ambitious scientists. Theoretically, the process would require three key breakthroughs. Scientists must first discover how to preserve, non-destructively, someone's brain upon their death. Then the content of the preserved brain must be analysed and captured. Finally, that capture of the person’s mind must be recreated on a simulated human brain.

Consider the last of these steps: creating an artificial human brain on which a back-up of a human’s memories would be able to ‘run’. Work in the area is widespread. MIT runs a course on the emergent science of ‘connectomics’, the work to create a comprehensive map of the connections in a human brain. The US Brain project is working to record brain activity from millions of neurons, while the EU Brain project tries to build integrated models from this activity.

Anders Sandberg from the Future of Humanity Institute at Oxford University, who in 2008 wrote a paper titled Whole Brain Emulation: A Roadmap, describes these projects as “stepping stones” towards being able to fully emulate the human brain.

“The point of brain emulation is to recreate the function of the original brain: if ‘run’ it will be able to think and act as the original,” he says. Progress has been slow but steady. “We are now able to take small brain tissue samples and map them in 3D. These are at exquisite resolution, but the blocks are just a few microns across. We can run simulations of the size of a mouse brain on supercomputers – but we do not have the total connectivity yet. As methods improve I expect to see automatic conversion of scanned tissue into models that can be run. The different parts exist, but so far there is no pipeline from brains to emulations.”

Investment in the area appears to be forthcoming, however. Google is heavily invested in brain emulation. In December 2012 the company appointed Ray Kurzweil as its director of engineering on the Google Brain project, which aims to mimic aspects of the human brain. Kurzweil, a divisive figure, is something of a figurehead for a community of scientists who believe that it will be possible to create a digital back-up of a human brain within their lifetime. A few months later, the company hired Geoff Hinton, a British computer scientist who is one of the world's leading experts on neural networks, essentially the circuitry of how the human mind thinks and remembers.

Google is not alone, either. In 2011 a Russian entrepreneur, Dmitry Itskov, founded ‘The 2045 Initiative’, named after Kurzweil’s prediction that the year 2045 will mark the point at which we’ll be able to back up our minds to the cloud. While the fruits of all this work are, to date, largely undisclosed, the effort is clear.

Neuroscientist Randal Koene, science director for the 2045 Initiative, is adamant that creating a working replica of a human brain is within reach. “The development of neural prostheses already demonstrates that running functions of the mind is possible,” he says. It’s not hyperbole. Ted Berger, a professor at the University of Southern California’s Center for Neuroengineering, has managed to create a working prosthetic of the hippocampus part of the brain. In 2011 a proof-of-concept hippocampal prosthesis was successfully tested in live rats and, in 2012, the prosthetic was successfully tested in non-human primates. Berger and his team intend to test the prosthesis in humans this year, demonstrating that we are already able to recreate some parts of the human brain.

Memory dump

Emulating a human brain is one thing, but creating a digital record of a human’s memories is a different sort of challenge. Sandberg is sceptical that so straightforward a process is viable. “Memories are not neatly stored like files on a computer to create a searchable index,” he says. “Memory consists of networks of associations that are activated when we remember. A brain emulation would require a copy of them all.”

Indeed, humans reconstruct information from multiple parts of the brain in ways that are shaped by our current beliefs and biases, all of which change over time. These conclusions appear at odds with any effort to store memories in the same way that a computer might record data for easy access. It is an idea based on, as one sceptic I spoke to (who wished to remain anonymous) put it, “the wrong and old-fashioned ‘possession’ view of memory”.

There is also the troubling issue of how to extract a person’s memories without destroying the brain in the process. “I am sceptical of the idea that we will be able to accomplish non-destructive scanning,” says Sandberg. “All methods able to scan neural tissue at the required high resolution are invasive, and I suspect this will be very hard to achieve without picking apart the brain.” Nevertheless, the professor believes a searchable, digital upload of a specific individual’s memory could be possible so long as you were able to “run” the simulated brain in its entirety.

“I think there is a good chance that it could work in reality, and that it could happen this century,” he says. “We might need to simulate everything down to the molecular level, in which case the computational demands would simply be too large. It might be that the brain uses hard-to-scan data like quantum states (an idea believed by some physicists but very few neuroscientists), that software cannot be conscious or do intelligence (an idea some philosophers believe but few computer scientists), and so on. I do not think these problems apply, but it remains to be seen if I am right.”

If it could be done, then, what would preserving a human mind mean for the way we live?

Some believe that there could be unanticipated benefits, some of which can make the act of merely extending a person’s life for posterity seem rather plain by comparison. For example, David Wood, chairman of the London Futurists, argues that a digital back-up of a person’s mind could be studied, perhaps providing breakthroughs in understanding the way in which human beings think and remember.

And if a mind could be digitally stored while a person was still alive then, according to neuroscientist Andrew A Vladimirov, it might be possible to perform psychoanalysis using such data. “You could run specially crafted algorithms through your entire life sequence that will help you optimise behavioural strategies,” he says.

Yet there’s also an unusual set of moral and ethical implications to consider, many of which are only just beginning to be revealed. “In the early stages the main ethical issue is simply broken emulations: we might get entities that are suffering in our computers,” says Sandberg. “There are also going to be issues of volunteer selection, especially if scanning is destructive.” Beyond the difficulty of recruiting people who are willing to donate their minds in such a way, there is the more complicated issue of what rights an emulated mind would enjoy. “Emulated people should likely have the same rights as normal people, but securing these would involve legislative change,” says Sandberg. “There might be the need for new kinds of rights too. For example, the right for an emulated human to run in real-time so that they can participate in society.”

Defining the boundaries of a person’s privacy is already a pressing issue for humanity in 2015, where third-party corporations and governments hold more insight into our personal information than ever before. For an emulated mind, privacy and ownership of data becomes yet more complicated. “Emulations are vulnerable and can suffer rather serious breaches of privacy and integrity,” says Sandberg. He adds, in a line that could be lifted from a Philip K Dick novel: “We need to safeguard their rights”. By way of example, he suggests that lawmakers would need to consider whether it should be possible to subpoena memories.

Property laws

“Ownership of specific memories is where things become complex,” says Koene. “In a memoir you can choose which memories are recorded. But if you cannot choose which of your memories others can inspect, it becomes a rather different question.” Is it a human right to be able to keep secrets?

These largely unexamined questions also begin to touch on more fundamental issues of what it means to be human. Would an emulated brain be considered human and, if so, does its humanity reside in the memories or in the hardware on which the simulated brain runs? If it's the latter, there’s the question of who owns the hardware: an individual, a corporation or the state? If an uploaded mind requires certain software to run (a hypothetical Google Brain, for example) the ownership of the software license could become contentious.

The knowledge that one’s brain is to be recorded in its entirety might also lead some to behave differently during life. “I think it would have the same effect as knowing your actions will be recorded on camera,” says Sandberg. “In some people this knowledge leads to a tendency to conform to social norms. In others it produces rebelliousness. If one thinks that one will be recreated as a brain emulation then it is equivalent to expecting an extra, post-human life.”

Even if it were possible to digitally record the contents and psychological contours of the human mind, there are undeniably deep and complicated implications. But beyond this, there is the question of whether this is something that any of us truly want. Humans long to preserve their memories (or, in some cases, to forget them) because they remind us of who we are. If our memories are lost we cease to know who we were, what we accomplished, what it all meant. But at the same time, we tweak and alter our memories in order to create the narrative of our lives that fits us at any one time. To have everything recorded with equal weight and importance might not be useful, either to us or to those who follow us.

Where exactly is the true worth of the endeavour? Could it actually be the comforting knowledge for a person that they, to one degree or other, won’t be lost without trace? The survival instinct is common to all life: we eat, we sleep, we fight and, most enduringly, we reproduce. Through our descendants we reach for a form of immortality, a way to live on beyond our physical passing. All parents take part in a grand relay race through time, passing the gene baton on and on through the centuries. Our physical traits – those eyes, that hair, this temperament – endure in some diluted or altered form. So too, perhaps, do our metaphysical attributes (“what will survive of us is love,” as Philip Larkin tentatively put it in his 1956 poem, ‘An Arundel Tomb’). But it is the mere echo of immortality. Nobody lives forever; with death only the fading shadow of our life remains. There are the photographs of us playing as children. There are the antique books we once read. There is the blouse we once wore.

I ask Sunshine why he wants his life to be recorded in this way. “To be honest, I'm not sure,” he says. “The truly beautiful things in my life such as the parties I've thrown, the sex I've had, the friendships I’ve enjoyed. All of these things are too ephemeral to be preserved in any meaningful way. A part of me wants to build monuments to myself. But another part of me wants to disappear completely.” Perhaps that is true of us all: the desire to be remembered, but only the parts of us that we hope will be remembered. The rest can be discarded.

Despite my own grandmother’s careful distribution of her photographs prior to her death, many remained in her house. These eternally smiling, fading unknown faces evidently meant a great deal to her in life but now, without the framing context of her memories, they lost all but the most superficial meaning. In a curious way, they became a burden to those of us left behind.

My father asked my grandmother’s vicar (a kindly man who had been her friend for many years) what he should do with the pictures; to just throw the photographs away seemed somehow flippant and disrespectful. The vicar’s advice was simple. Take each photograph. Look at it carefully. In that moment you honour the person captured. Then you may discard it and be free.

SOURCE: http://www.bbc.com/future/story/20150122-the-secret-to-immortality

10
The truth about technology’s greatest myth
Tom Chatfield


Many optimists believe that technology can transform society, whether it’s the internet or the latest phone. But as Tom Chatfield argues in his final column for BBC Future, the truth about our relationship with technology is far more interesting.

Lecturing in late 1968, the American sociologist Harvey Sacks addressed one of the central failures of technocratic dreams. We have always hoped, Sacks argued, that “if only we introduced some fantastic new communication machine the world will be transformed.” Instead, though, even our best and brightest devices must be accommodated within existing practices and assumptions in a “world that has whatever organisation it already has.”

As an example, Sacks considered the telephone. Introduced into American homes during the last quarter of the 19th Century, instantaneous conversation across hundreds or even thousands of miles seemed close to a miracle. For Scientific American, editorializing in 1880, this heralded “nothing less than a new organization of society – a state of things in which every individual, however secluded, will have at call every other individual in the community, to the saving of no end of social and business complications…”

Yet the story that unfolded was not so much “a new organization of society” as the pouring of existing human behaviour into fresh moulds: our goodness, hope and charity; our greed, pride and lust. New technology didn’t bring an overnight revolution. Instead, there was strenuous effort to fit novelty into existing norms.

The most ferocious early debates around the telephone, for example, concerned not social revolution, but decency and deception. What did access to unseen interlocutors imply for the sanctity of the home – or for gullible or corruptible members of the household, such as women or servants? Was it disgraceful to chat while improperly dressed? Such were the daily concerns of 19th-century telephonics, matched by phone companies’ attempts to assure subscribers of their propriety.

As Sacks also put it, each new object is above all “the occasion for seeing again what we can see anywhere” – and perhaps the best aim for any writing about technology is to treat novelty not as an end, but as an opportunity to re-scrutinize ourselves.

I’ve been writing this fortnightly column since the start of 2012, and in the last two years have watched new devices and services become part of similar negotiations. By any measure, ours is an age preoccupied with novelty. Too often, though, it offers a road not to insight, but to a startling blindness about our own norms and assumptions.

Take the litany of numbers within which every commentary on modern tech is couched. Come the end of 2014, there will be more mobile phones in the world than people. We have moved from the launch of modern tablet computing in mid-2011 to tablets likely accounting for over half the global market in personal computers in 2014. Ninety per cent of the world’s data was created in the last two years. Today’s phones are more powerful than yesterday’s supercomputers. Today’s software is better than us at everything from chess to quiz shows. And so on.

Singularity myth

It’s a story in which both machines and their capabilities increase for ever, dragging us along for the exponential ride. Perhaps the defining geek myth of our age, The Singularity, anticipates a future in which machines cross an event horizon beyond which their intellects exceed our own. And while most people remain untouched by such faith, the apocalyptic eagerness it embodies is all too familiar. Surely it’s only a matter of time – the theory goes – before we finally escape, augment or otherwise overcome our natures and emerge into some new phase of the human story.

Or not. Because – while technological and scientific progress is indeed an astonishing thing – its relationship with human progress is more aspiration than established fact. Whether we like it or not, acceleration cannot continue indefinitely. We may long to escape flesh and history, but the selves we are busy reinventing come equipped with the same old gamut of beauties, perversities and all-too-human failings. In time, our dreams of technology departing mere actuality – and taking us along for the ride – will come to seem as quaint as Victorian gentlemen donning evening dress to make a phonecall.

This is one reason why, over the last two years, I’ve devoted a fair share of columns to the friction between the stories we tell about tech and its actual unfolding in our lives. From the surreptitious erosion of digital history to the dumbness of “smart” tech, via email’s dirty secrets and the importance of forgetfulness, I love exploring the tensions between digital tools and analogue selves – not because technology is to be dismissed or deplored, but because it remains as mired in history, politics and human frailty as everything else we touch.

This will be the last regular Life:Connected column I write for BBC Future. Instead, I’ll be writing a book about one of my obsessions: attention, and how its quantification and sale have become a battleground for 21st Century selves. I will, however, continue examining technology’s impact here and elsewhere – and asking what it means to watch ancient preoccupations poured into fresh, astounding moulds.

On which note: what do you think is most ripe for abandonment around technology today? Which habit will come to be seen by future generations as quaint – our equivalent of putting on bow ties for telephones? If you want to stay in touch, tweet me at @TomChatfield and let me know what you think.

SOURCE: http://www.bbc.com/future/story/20140110-technologys-greatest-myth

11
Journalism & Mass Communication / Are you ‘over-connected’?
« on: March 10, 2015, 05:25:12 PM »
Are you ‘over-connected’?
Tom Chatfield

Wander the city in 2015 and all you’ll see is people staring at screens or talking on handsets. Is it changing who we are? Tom Chatfield weighs up the arguments.
A group of people wait by a monument, unaware of each other’s existence. A woman strides open-mouthed down a busy street, holding one hand across her heart. Two young men – brothers? – stand behind a white fence, both their heads bowed at the same angle.

These are some of the moments captured in photographer Josh Pulman’s ongoing series called Somewhere Else, which documents people using mobile phones in public places (see pictures). Almost every street in every city across the world is packed with people doing this – something that didn’t exist a few decades ago. We have grown accustomed to the fact that shared physical space no longer means shared experience. Everywhere we go, we carry with us options far more enticing than the place and moment we happen to be standing within: access to friends, family, news, views, scandals, celebrity, work, leisure, information, rumour.

Little wonder that we are transfixed; that the faces in Pulman’s images ripple with such emotion. We are free, if “free” is the right word, to beam stimulation or distraction into our brains at any moment. Via the screens we carry – and will soon be wearing – it has never been easier to summon those we love, need, care about or rely upon.

Yet, as Pulman himself asks, “If two people are walking down the street together both on the phone to someone else, are they really together? And what is the effect on the rest of us of such public displays of emotion, whether it’s anxiety, rage or joy?” To be human is to crave connection. But can our talent betray us? Is it possible to be “overconnected” – and, if so, what does it mean for our future?

Life down a line

Telephones have been both an engine of social disruption and a focus for technological anxiety ever since their invention. Imagine the scene through 19th Century eyes, when the infrastructure of the first telephone networks began to be laid out: mile upon mile of wires hung along the side of public roads, piercing every house in turn. Walls were being breached: the sanctum of the home plugged into a new species of human interaction.

The electric telegraph had already given the world something miraculous: messaging at the speed of electricity. Telephones, though, bore not the business-like dots and dashes of Morse code, but the human voice, whispering out of the ether into any willing listener’s ear. “We shall soon be nothing but transparent heaps of jelly to each other,” lamented a British writer in 1897, fearing privacy’s replacement by the promiscuity of a new media age: one in which there was nowhere for the unmediated self to hide.

Doom-laden warnings over new technology are nothing new, as I described recently in a programme for BBC Radio 4. Here’s a clip, pointing out why the practice goes back at least to the Ancient Greeks:

Hear the full BBC Radio 4 programme: Has technology rewired our brains?

Still, while early fears about the telephone may have been exaggerated, they were also prophetic. If one great technological drive of the late 19th and 20th Centuries was to plug every place of work and leisure into networks of power, transport and communications, then the emerging story of the 21st Century is the interconnection of our own minds into a similarly networked state. We’re no longer drilling holes in the walls of our houses for telephone wires. It’s ourselves we’re plugging in; and we’re starting to feel the strain.

Always on

Like its 19th Century ancestors, the mobile phone began as a status symbol for the busy and affluent: a weighty hunk of the cutting edge, to be bellowed into as publicly as possible. Over time, the luxury became universal, the symbol splintered into countless social circumstances. We began to weave constant availability into our conception of public and private space; into our body language and everyday etiquette (“I’ll get there for midday and give you a ring”). Being uncontactable has become exceptional, outlandish, a brand of luxury and distinction – or, depending on your perspective, a source of escalating anxiety in itself.

And, like history repeating itself, warnings of the ill effects of mobile communication are once again rising – a focus for angst in an age where our ambivalence about constant connection conceals the more pressing question of what, precisely, we’re connecting to.

Consider the ease with which a news story spread recently about a 31-year-old man treated for “internet addiction disorder,” related to his excessive use of Google Glass (a technology since shelved in the name of redevelopment). In many ways, using Google Glass is like strapping a smartphone to your face. A wearable device boasting built-in camera, microphone, tiny screen and internet connectivity, it’s activated either via voice or by a gentle tap of the fingers. Doctors noted that the subject compulsively mimicked this movement, moving his right hand up to his temple and tapping his skull even when he was not wearing Glass. He had been using it for up to 18 hours a day, and at night dreamed that he was looking at the world through the device.

This is a scare story tailor-made for our times. A troubled life (the man in question had a history of mood disorder and alcohol misuse) meets a seduction too great to resist and sinks into addiction. For some readers, though, I suspect it also raises nervous questions. How often do your own hands twitch involuntarily towards your phone, or the spot where you normally keep it? How does the buzz of each arriving message make you feel – or its absence when there’s no network? How far does the prison of an addict’s life echo your own relationships with technology?

The problem is, these aren’t questions with definitive answers. Drawing a line between habit and pathology means deciding what we mean by normal, healthy and acceptable behaviour. And if technology excels at one thing, it’s at shifting old norms faster than even the nimblest neophyte can handle. I’ve spent years trying to evaluate our relationships with technology, and still find myself pulled in two different directions.

On the one hand, as the philosopher Julian Baggini once put it to me, “human beings may be changing but in many ways we remain very much the same”. I can still read translations of ancient Roman or Greek literature and know exactly what their authors mean when they talk about anger, passion, patriotism, trust, betrayal.

On the other hand, digital technologies mean my relationships with others and the world are extended and amplified beyond anything even my grandparents knew. I outsource memories, routines, habits and responsibilities to ubiquitous hardware; I gratefully automate everything from route-finding and research to recommending movies.

As philosophers like Andy Clark and David J Chalmers have argued, my mind is a kind of collaboration between the brain in my head and tools like the phone in my hand: “I” am a complex system that encompasses both. And why shouldn’t I simply celebrate this ease, much as I do the freedoms that come with owning a car or a dishwasher, or wearing glasses to correct my sight?

One objection is that, even if you don’t buy into the hypothesis that my phone is effectively a handheld piece of my mind, it’s hard to ignore the mounting evidence around human cognition’s vulnerabilities. We are not only creatures of habit; we are also creatures of limited and easily exhausted conscious scrutiny. Distract or tire someone – give them a few mental arithmetic problems to solve, flash adverts at the corners of their vision – and their willpower is depleted. “Nudging” our every decision is now a science fed by billions of bits of data. And what better mechanism for tiring even the sharpest thinker than the tireless buzz of hardware in our pockets and software in its encircling cloud?

It’s this exponential impact of information technology that poses the greatest problem for everything we used to think about as normal, balanced, self-knowing and self-regulating. We live in an age of suffusion, and our pathologies are those of excess. Junk food, engineered to a tastiness we cannot stop cramming into our mouths. Junk media, junk information, junk time – attention-seeking algorithmic twitches seeking to become part of the patterns of our minds.

Time off

Do we need to diet, to detox? Whether it’s physical or mental health you’re talking about, neither works for most people – or begins to address the causes of excess. What’s the point of unplugging if the only reason for doing so is to plug yourself still more eagerly back in at a later date? Better to face facts, and to begin with the extraordinary intimacy of a relationship that is only going to get closer: between the brains in our bodies and the glistening webs of automation we’re weaving between them.

After all, I’m pouring my hours and minutes not simply into a screen, but into the most comprehensive networking of human minds ever achieved, each one more powerful than the fastest computer. If I’m so often enthralled, appalled, over-engaged, distracted and delighted, it’s because there are others out there sifting and refracting this world of information right back at me. And if I’m going to change this, it’s only going to happen if I can find others with whom I can build new habits, patterns and modes of practice.

To quote my exchange with Julian Baggini once again, there’s a paradox underpinning the power of even the most intricate technological manipulations: that “the methods used to manipulate us are more sophisticated than ever, but precisely because knowledge of how to do this has grown, we are more able to defend ourselves”. For instance, I don’t need to know everything there is to know about privacy, hacking and encryption to protect myself against government snooping. If I can find expert, reliable advice on protecting myself, I can at least begin the journey towards greater control and engagement.

In this sense, machines themselves are a misleading target for anxiety. Toxic offline communities and systems abound; technology, as it has always done, facilitates interactions at each extreme of the human spectrum. It may be hard to disconnect, but we can seek better to control who we connect with and what we ask of each other.

My favourite photograph in Josh Pulman’s series “Somewhere Else”, the eighth, is unusual because the woman in it is smiling (see image, above). I have no idea why she’s smiling, but I suspect it’s in response to the voice crackling into her ear; good news, relief, a joke. Everyone else caught on their phones seems anxious, alarmed, unhappily torn between worlds. But she is glad to be elsewhere, and I assume her partner in conversation is too. The pattern is rich enough not to be a prison; two minds are delightedly spanning the earth.

SOURCE: http://www.bbc.com/future/story/20150310-are-you-over-connected

12
Journalism & Mass Communication / Impressions of Gaza: Noam Chomsky
« on: March 10, 2015, 05:11:37 PM »
Impressions of Gaza
Noam Chomsky
chomsky.info, November 4, 2012

Even a single night in jail is enough to give a taste of what it means to be under the total control of some external force. And it hardly takes more than a day in Gaza to begin to appreciate what it must be like to try to survive in the world’s largest open-air prison, where a million and a half people, in the most densely populated area of the world, are constantly subject to random and often savage terror and arbitrary punishment, with no purpose other than to humiliate and degrade, and with the further goal of ensuring that Palestinian hopes for a decent future will be crushed and that the overwhelming global support for a diplomatic settlement that will grant these rights will be nullified.

The intensity of this commitment on the part of the Israeli political leadership has been dramatically illustrated just in the past few days, as they warn that they will “go crazy” if Palestinian rights are given limited recognition at the UN. That is not a new departure. The threat to “go crazy” (“nishtagea”) is deeply rooted, back to the Labor governments of the 1950s, along with the related “Samson Complex”: we will bring down the Temple walls if crossed. It was an idle threat then; not today.

The purposeful humiliation is also not new, though it constantly takes new forms. Thirty years ago political leaders, including some of the most noted hawks, submitted to Prime Minister Begin a shocking and detailed account of how settlers regularly abuse Palestinians in the most depraved manner and with total impunity. The prominent military-political analyst Yoram Peri wrote with disgust that the army’s task is not to defend the state, but “to demolish the rights of innocent people just because they are Araboushim (“niggers,” “kikes”) living in territories that God promised to us.”

Gazans have been selected for particularly cruel punishment. It is almost miraculous that people can sustain such an existence. How they do so was described thirty years ago in an eloquent memoir by Raja Shehadeh (The Third Way), based on his work as a lawyer engaged in the hopeless task of trying to protect elementary rights within a legal system designed to ensure failure, and his personal experience as a Samid, “a steadfast one,” who watches his home turned into a prison by brutal occupiers and can do nothing but somehow “endure.”

Since Shehadeh wrote, the situation has become much worse. The Oslo agreements, celebrated with much pomp in 1993, determined that Gaza and the West Bank are a single territorial entity. By then the US and Israel had already initiated their program of separating them fully from one another, so as to block a diplomatic settlement and punish the Araboushim in both territories.

Punishment of Gazans became still more severe in January 2006, when they committed a major crime: they voted the “wrong way” in the first free election in the Arab world, electing Hamas. Demonstrating their passionate “yearning for democracy,” the US and Israel, backed by the timid European Union, at once imposed a brutal siege, along with intensive military attacks. The US also turned at once to standard operating procedure when some disobedient population elects the wrong government: prepare a military coup to restore order.

Gazans committed a still greater crime a year later by blocking the coup attempt, leading to a sharp escalation of the siege and military attacks. These culminated in winter 2008-9, with Operation Cast Lead, one of the most cowardly and vicious exercises of military force in recent memory, as a defenseless civilian population, trapped with no way to escape, was subjected to relentless attack by one of the world’s most advanced military systems relying on US arms and protected by US diplomacy. An unforgettable eyewitness account of the slaughter — “infanticide” in their words — is given by the two courageous Norwegian doctors who worked at Gaza’s main hospital during the merciless assault, Mads Gilbert and Erik Fosse, in their remarkable book Eyes in Gaza.

President-elect Obama was unable to say a word, apart from reiterating his heartfelt sympathy for children under attack — in the Israeli town Sderot. The carefully planned assault was brought to an end right before his inauguration, so that he could then say that now is the time to look forward, not backward, the standard refuge of criminals.

Of course, there were pretexts — there always are. The usual one, trotted out when needed, is "security": in this case, home-made rockets from Gaza. As is commonly the case, the pretext lacked any credibility. In 2008 a truce was established between Israel and Hamas. The Israeli government formally recognizes that Hamas observed it fully. Not a single Hamas rocket was fired until Israel broke the truce under cover of the US election on November 4, 2008, invading Gaza on ludicrous grounds and killing half a dozen Hamas members. The Israeli government was advised by its highest intelligence officials that the truce could be renewed by easing the criminal blockade and ending military attacks. But the government of Ehud Olmert, reputedly a dove, chose to reject these options, preferring to resort to its huge comparative advantage in violence: Operation Cast Lead. The basic facts are reviewed once again by foreign policy analyst Jerome Slater in the current issue of the Harvard-MIT journal International Security.

The pattern of bombing under Cast Lead was carefully analyzed by the highly informed and internationally respected Gazan human rights advocate Raji Sourani. He points out that the bombing was concentrated in the north, targeting defenseless civilians in the most densely populated areas, with no possible military pretext. The goal, he suggests, may have been to drive the intimidated population to the south, near the Egyptian border. But the Samidin stayed put, despite the avalanche of US-Israeli terror.

A further goal might have been to drive them beyond. Back to the earliest days of the Zionist colonization it was argued across much of the spectrum that Arabs have no real reason to be in Palestine; they can be just as happy somewhere else, and should leave — politely "transferred," the doves suggested. This is surely no small concern in Egypt, and perhaps a reason why Egypt does not open the border freely to civilians or even to desperately needed materials.

Sourani and other knowledgeable sources observe that the discipline of the Samidin conceals a powder keg, which might explode any time, unexpectedly, as the first Intifada did in Gaza in 1989 after years of miserable repression that elicited no notice or concern.

Merely to mention one of innumerable cases, shortly before the outbreak of the Intifada a Palestinian girl, Intissar al-Atar, was shot and killed in a schoolyard by a resident of a nearby Jewish settlement. He was one of the several thousand Israeli settlers brought to Gaza in violation of international law and protected by a huge army presence, taking over much of the land and scarce water of the Strip and living "lavishly in twenty-two settlements in the midst of 1.4 million destitute Palestinians," as the crime is described by Israeli scholar Avi Raz. The murderer of the schoolgirl, Shimon Yifrah, was arrested, but quickly released on bail when the court determined that "the offense is not severe enough" to warrant detention. The judge commented that Yifrah only intended to shock the girl by firing his gun at her in a schoolyard, not to kill her, so "this is not a case of a criminal person who has to be punished, deterred, and taught a lesson by imprisoning him." Yifrah was given a 7-month suspended sentence, while settlers in the courtroom broke out in song and dance. And the usual silence reigned. After all, it is routine.

And so it is. As Yifrah was freed, the Israeli press reported that an army patrol fired into the yard of a school for boys aged 6 to 12 in a West Bank refugee camp, wounding five children, allegedly intending only “to shock them.” There were no charges, and the event again attracted no attention. It was just another episode in the program of “illiteracy as punishment,” the Israeli press reported, including the closing of schools, use of gas bombs, beating of students with rifle butts, barring of medical aid for victims; and beyond the schools a reign of more severe brutality, becoming even more savage during the Intifada, under the orders of Defense Minister Yitzhak Rabin, another admired dove.

My initial impression, after a visit of several days, was amazement, not only at the ability to go on with life, but also at the vibrancy and vitality among young people, particularly at the university, where I spent much of my time at an international conference. But there too one can detect signs that the pressure may become too hard to bear. Reports indicate that among young men there is simmering frustration, recognition that under the US-Israeli occupation the future holds nothing for them. There is only so much that caged animals can endure, and there may be an eruption, perhaps taking ugly forms — offering an opportunity for Israeli and western apologists to self-righteously condemn the people who are culturally backward, as Mitt Romney insightfully explained.

Gaza has the look of a typical third world society, with pockets of wealth surrounded by hideous poverty. It is not, however, “undeveloped.” Rather it is “de-developed,” and very systematically so, to borrow the terms of Sara Roy, the leading academic specialist on Gaza. The Gaza Strip could have become a prosperous Mediterranean region, with rich agriculture and a flourishing fishing industry, marvelous beaches and, as discovered a decade ago, good prospects for extensive natural gas supplies within its territorial waters.   

By coincidence or not, that is when Israel intensified its naval blockade, driving fishing boats toward shore, by now to 3 miles or less.

The favorable prospects were aborted in 1948, when the Strip had to absorb a flood of Palestinian refugees who fled in terror or were forcefully expelled from what became Israel, in some cases expelled months after the formal cease-fire.

In fact, they were being expelled even four years later, as reported in Ha'aretz (25.12.2008), in a thoughtful study by Beni Tziper on the history of Israeli Ashkelon back to the Canaanites. In 1953, he reports, there was a "cool calculation that it was necessary to cleanse the region of Arabs." The original name, Majdal, had already been "Judaized" to today's Ashkelon, a regular practice.

That was in 1953, when there was no hint of military necessity. Tziper himself was born in 1953, and while walking in the remnants of the old Arab sector, he reflects that “it is really difficult for me, really difficult, to realize that while my parents were celebrating my birth, other people were being loaded on trucks and expelled from their homes.”

Israel’s 1967 conquests and their aftermath administered further blows. Then came the terrible crimes already mentioned, continuing to the present day.

The signs are easy to see, even on a brief visit. Sitting in a hotel near the shore, one can hear the machine gun fire of Israeli gunboats driving fishermen out of Gaza’s territorial waters and towards shore, so they are compelled to fish in waters that are heavily polluted because of US-Israeli refusal to allow reconstruction of the sewage and power systems that they destroyed.

The Oslo Accords laid plans for two desalination plants, a necessity in this arid region. One, an advanced facility, was built: in Israel. The second one is in Khan Yunis, in the south of Gaza. The engineer in charge of trying to obtain potable water for the population explained that this plant was designed so that it cannot use sea water, but must rely on underground water, a cheaper process, which further degrades the meager aquifer, guaranteeing severe problems in the future. Even with that, water is severely limited. The United Nations Relief and Works Agency (UNRWA), which cares for refugees (but not other Gazans), recently released a report warning that damage to the aquifer may soon become “irreversible,” and that without remedial action quickly, by 2020 Gaza may not be a “liveable place.”

Israel permits concrete to enter for UNRWA projects, but not for Gazans engaged in the huge reconstruction needs. The limited heavy equipment mostly lies idle, since Israel does not permit materials for repair. All of this is part of the general program described by Israeli official Dov Weisglass, an adviser to Prime Minister Ehud Olmert, after Palestinians failed to follow orders in the 2006 elections: “The idea,” he said, “is to put the Palestinians on a diet, but not to make them die of hunger.” That would not look good.

And the plan is being scrupulously followed. Sara Roy has provided extensive evidence in her scholarly studies. Recently, after several years of effort, the Israeli human rights organization Gisha succeeded in obtaining a court order for the government to release its records detailing plans for the diet, and how they are executed. Israel-based journalist Jonathan Cook summarizes them: "Health officials provided calculations of the minimum number of calories needed by Gaza's 1.5 million inhabitants to avoid malnutrition. Those figures were then translated into truckloads of food Israel was supposed to allow in each day ... an average of only 67 trucks — much less than half of the minimum requirement — entered Gaza daily. This compared to more than 400 trucks before the blockade began." And even this estimate is overly generous, UN relief officials report.

The result of imposing the diet, Mideast scholar Juan Cole observes, is that “[a]bout ten percent of Palestinian children in Gaza under 5 have had their growth stunted by malnutrition ... in addition, anemia is widespread, affecting over two-thirds of infants, 58.6 percent of schoolchildren, and over a third of pregnant mothers.” The US and Israel want to ensure that nothing more than bare survival is possible.

“What has to be kept in mind,” observes Raji Sourani, “is that the occupation and the absolute closure is an ongoing attack on the human dignity of the people in Gaza in particular and all Palestinians generally. It is systematic degradation, humiliation, isolation and fragmentation of the Palestinian people.” The conclusion is confirmed by many other sources. In one of the world’s leading medical journals, The Lancet, a visiting Stanford physician, appalled by what he witnessed, describes Gaza as “something of a laboratory for observing an absence of dignity,” a condition that has “devastating” effects on physical, mental, and social wellbeing. “The constant surveillance from the sky, collective punishment through blockade and isolation, the intrusion into homes and communications, and restrictions on those trying to travel, or marry, or work make it difficult to live a dignified life in Gaza.” The Araboushim must be taught not to raise their heads.

There were hopes that the new Morsi government in Egypt, less in thrall to Israel than the western-backed Mubarak dictatorship, might open the Rafah crossing, the sole access to the outside for trapped Gazans that is not subject to direct Israeli control. There has been a slight opening, but not much. Journalist Laila el-Haddad writes that the re-opening under Morsi "is simply a return to status quo of years past: only Palestinians carrying an Israeli-approved Gaza ID card can use Rafah Crossing," excluding a great many Palestinians, including el-Haddad's family, where only one spouse has a card.

Furthermore, she continues, “the crossing does not lead to the West Bank, nor does it allow for the passage of goods, which are restricted to the Israeli-controlled crossings and subject to prohibitions on construction materials and export.” The restricted Rafah crossing does not change the fact that “Gaza remains under tight maritime and aerial siege, and continues to be closed off to the Palestinians’ cultural, economic, and academic capitals in the rest of the [occupied territories], in violation of US-Israeli obligations under the Oslo Accords.”

The effects are painfully evident. In the Khan Yunis hospital, the director, who is also chief of surgery, describes with anger and passion how even medicines are lacking for relief of suffering patients, as well as simple surgical equipment, leaving doctors helpless and patients in agony. Personal stories add vivid texture to the general disgust one feels at the obscenity of the harsh occupation. One example is the testimony of a young woman who despaired that her father, who would have been proud that she was the first woman in the refugee camp to gain an advanced degree, had “passed away after 6 months of fighting cancer aged 60 years. Israeli occupation denied him a permit to go to Israeli hospitals for treatment. I had to suspend my study, work and life and go to set next to his bed. We all sat including my brother the physician and my sister the pharmacist, all powerless and hopeless watching his suffering. He died during the inhumane blockade of Gaza in summer 2006 with very little access to health service. I think feeling powerless and hopeless is the most killing feeling that human can ever have. It kills the spirit and breaks the heart. You can fight occupation but you cannot fight your feeling of being powerless. You can't even dissolve that feeling.”

Disgust at the obscenity, compounded with guilt: it is within our power to bring the suffering to an end and allow the Samidin to enjoy the lives of peace and dignity that they deserve.

SOURCE: http://www.chomsky.info/articles/20121104.htm

13
Can Civilization Survive Capitalism?
Noam Chomsky
Alternet, March 5, 2013

The term "capitalism" is commonly used to refer to the U.S. economic system, with substantial state intervention ranging from subsidies for creative innovation to the "too-big-to-fail" government insurance policy for banks.

The system is highly monopolized, further limiting reliance on the market, and increasingly so: In the past 20 years the share of profits of the 200 largest enterprises has risen sharply, reports scholar Robert W. McChesney in his new book "Digital Disconnect."

"Capitalism" is a term now commonly used to describe systems in which there are no capitalists: for example, the worker-owned Mondragon conglomerate in the Basque region of Spain, or the worker-owned enterprises expanding in northern Ohio, often with conservative support -- both are discussed in important work by the scholar Gar Alperovitz.

Some might even use the term "capitalism" to refer to the industrial democracy advocated by John Dewey, America's leading social philosopher, in the late 19th century and early 20th century.

Dewey called for workers to be "masters of their own industrial fate" and for all institutions to be brought under public control, including the means of production, exchange, publicity, transportation and communication. Short of this, Dewey argued, politics will remain "the shadow cast on society by big business."

The truncated democracy that Dewey condemned has been left in tatters in recent years. Now control of government is narrowly concentrated at the peak of the income scale, while the large majority "down below" has been virtually disenfranchised. The current political-economic system is a form of plutocracy, diverging sharply from democracy, if by that concept we mean political arrangements in which policy is significantly influenced by the public will.

There have been serious debates over the years about whether capitalism is compatible with democracy. If we keep to really existing capitalist democracy -- RECD for short -- the question is effectively answered: They are radically incompatible.

It seems to me unlikely that civilization can survive RECD and the sharply attenuated democracy that goes along with it. But could functioning democracy make a difference?

Let's keep to the most critical immediate problem that civilization faces: environmental catastrophe. Policies and public attitudes diverge sharply, as is often the case under RECD. The nature of the gap is examined in several articles in the current issue of Daedalus, the journal of the American Academy of Arts and Sciences.

Researcher Kelly Sims Gallagher finds that "One hundred and nine countries have enacted some form of policy regarding renewable power, and 118 countries have set targets for renewable energy. In contrast, the United States has not adopted any consistent and stable set of policies at the national level to foster the use of renewable energy."

It is not public opinion that drives American policy off the international spectrum. Quite the opposite. Opinion is much closer to the global norm than the U.S. government's policies reflect, and much more supportive of actions needed to confront the likely environmental disaster predicted by an overwhelming scientific consensus — one that's not too far off, very likely affecting the lives of our grandchildren.

As Jon A. Krosnick and Bo MacInnis report in Daedalus: "Huge majorities have favored steps by the federal government to reduce the amount of greenhouse gas emissions generated when utilities produce electricity. In 2006, 86 percent of respondents favored requiring utilities, or encouraging them with tax breaks, to reduce the amount of greenhouse gases they emit. Also in that year, 87 percent favored tax breaks for utilities that produce more electricity from water, wind or sunlight." These majorities were maintained between 2006 and 2010 and shrank somewhat after that.

The fact that the public is influenced by science is deeply troubling to those who dominate the economy and state policy.

One current illustration of their concern is the "Environmental Literacy Improvement Act" proposed to state legislatures by ALEC, the American Legislative Exchange Council, a corporate-funded lobby that designs legislation to serve the needs of the corporate sector and extreme wealth.

The ALEC Act mandates "balanced teaching" of climate science in K-12 classrooms. "Balanced teaching" is a code phrase that refers to teaching climate-change denial, to "balance" mainstream climate science. It is analogous to the "balanced teaching" advocated by creationists to enable the teaching of "creation science" in public schools. Legislation based on ALEC models has already been introduced in several states.

Of course, all of this is dressed up in rhetoric about teaching critical thinking -- a fine idea, no doubt, but it's easy to think up far better examples than an issue that threatens our survival and has been selected because of its importance in terms of corporate profits.

Media reports commonly present a controversy between two sides on climate change.

One side consists of the overwhelming majority of scientists, the world's major national academies of science, the professional science journals and the Intergovernmental Panel on Climate Change.

They agree that global warming is taking place, that there is a substantial human component, that the situation is serious and perhaps dire, and that very soon, maybe within decades, the world might reach a tipping point where the process will escalate sharply and will be irreversible, with severe social and economic effects. It is rare to find such consensus on complex scientific issues.

The other side consists of skeptics, including a few respected scientists who caution that much is unknown -- which means that things might not be as bad as thought, or they might be worse.

Omitted from the contrived debate is a much larger group of skeptics: highly regarded climate scientists who see the IPCC's regular reports as much too conservative. And these scientists have repeatedly been proven correct, unfortunately.

The propaganda campaign has apparently had some effect on U.S. public opinion, which is more skeptical than the global norm. But the effect is not significant enough to satisfy the masters. That is presumably why sectors of the corporate world are launching their attack on the educational system, in an effort to counter the public's dangerous tendency to pay attention to the conclusions of scientific research.

At the Republican National Committee's Winter Meeting a few weeks ago, Louisiana Gov. Bobby Jindal warned the leadership that "We must stop being the stupid party ... We must stop insulting the intelligence of voters."

Within the RECD system it is of extreme importance that we become the stupid nation, not misled by science and rationality, in the interests of the short-term gains of the masters of the economy and political system, and damn the consequences.

These commitments are deeply rooted in the fundamentalist market doctrines that are preached within RECD, though observed in a highly selective manner, so as to sustain a powerful state that serves wealth and power.

The official doctrines suffer from a number of familiar "market inefficiencies," among them the failure to take into account the effects on others in market transactions. The consequences of these "externalities" can be substantial. The current financial crisis is an illustration. It is partly traceable to the major banks and investment firms' ignoring "systemic risk" -- the possibility that the whole system would collapse -- when they undertook risky transactions.

Environmental catastrophe is far more serious: The externality that is being ignored is the fate of the species. And there is nowhere to run, cap in hand, for a bailout.

In the future, historians (if there are any) will look back on this curious spectacle taking shape in the early 21st century. For the first time in human history, humans are facing the significant prospect of severe calamity as a result of their actions — actions that are battering our prospects of decent survival.

Those historians will observe that the richest and most powerful country in history, which enjoys incomparable advantages, is leading the effort to intensify the likely disaster. Leading the effort to preserve conditions in which our immediate descendants might have a decent life are the so-called "primitive" societies: First Nations, tribal, indigenous, aboriginal.

The countries with large and influential indigenous populations are well in the lead in seeking to preserve the planet. The countries that have driven indigenous populations to extinction or extreme marginalization are racing toward destruction.

Thus Ecuador, with its large indigenous population, is seeking aid from the rich countries to allow it to keep its substantial oil reserves underground, where they should be.

Meanwhile the U.S. and Canada are seeking to burn fossil fuels, including the extremely dangerous Canadian tar sands, and to do so as quickly and fully as possible, while they hail the wonders of a century of (largely meaningless) energy independence without a side glance at what the world might look like after this extravagant commitment to self-destruction.

This observation generalizes: Throughout the world, indigenous societies are struggling to protect what they sometimes call "the rights of nature," while the civilized and sophisticated scoff at this silliness.

This is all exactly the opposite of what rationality would predict -- unless it is the skewed form of reason that passes through the filter of RECD.

SOURCE: http://www.chomsky.info/articles/20130305.htm

14
De-Americanizing the World
Noam Chomsky
Truthout, November 5, 2013

During the latest episode of the Washington farce that has astonished a bemused world, a Chinese commentator wrote that if the United States cannot be a responsible member of the world system, perhaps the world should become "de-Americanized" -- and separate itself from the rogue state that is the reigning military power but is losing credibility in other domains.

The Washington debacle's immediate source was the sharp shift to the right among the political class. In the past, the U.S. has sometimes been described sardonically -- but not inaccurately -- as a one-party state: the business party, with two factions called Democrats and Republicans.

That is no longer true. The U.S. is still a one-party state, the business party. But it only has one faction: moderate Republicans, now called New Democrats (as the U.S. Congressional coalition styles itself).

There is still a Republican organization, but it long ago abandoned any pretense of being a normal parliamentary party. Conservative commentator Norman Ornstein of the American Enterprise Institute describes today's Republicans as "a radical insurgency -- ideologically extreme, scornful of facts and compromise, dismissive of the legitimacy of its political opposition": a serious danger to the society.

The party is in lock-step service to the very rich and the corporate sector. Since votes cannot be obtained on that platform, the party has been compelled to mobilize sectors of the society that are extremist by world standards. Crazy is the new norm among Tea Party members and a host of others beyond the mainstream.

The Republican establishment and its business sponsors had expected to use them as a battering ram in the neoliberal assault against the population -- to privatize, to deregulate and to limit government, while retaining those parts that serve wealth and power, like the military.

The Republican establishment has had some success, but now finds that it can no longer control its base, much to its dismay. The impact on American society thus becomes even more severe. A case in point: the virulent reaction against the Affordable Care Act and the near-shutdown of the government.

The Chinese commentator's observation is not entirely novel. In 1999, political analyst Samuel P. Huntington warned that for much of the world, the U.S. is "becoming the rogue superpower," seen as "the single greatest external threat to their societies."

A few months into the Bush term, Robert Jervis, president of the American Political Science Association, warned that "In the eyes of much of the world, in fact, the prime rogue state today is the United States." Both Huntington and Jervis warned that such a course is unwise. The consequences for the U.S. could be harmful.

In the latest issue of Foreign Affairs, the leading establishment journal, David Kaye reviews one aspect of Washington's departure from the world: rejection of multilateral treaties "as if it were sport."

He explains that some treaties are rejected outright, as when the U.S. Senate "voted against the Convention on the Rights of Persons with Disabilities in 2012 and the Comprehensive Nuclear-Test-Ban Treaty (CTBT) in 1999."

Others are dismissed by inaction, including "such subjects as labor, economic and cultural rights, endangered species, pollution, armed conflict, peacekeeping, nuclear weapons, the law of the sea, and discrimination against women."

Rejection of international obligations "has grown so entrenched," Kaye writes, "that foreign governments no longer expect Washington's ratification or its full participation in the institutions treaties create. The world is moving on; laws get made elsewhere, with limited (if any) American involvement."

While not new, the practice has indeed become more entrenched in recent years, along with quiet acceptance at home of the doctrine that the U.S. has every right to act as a rogue state.

To take a typical example, a few weeks ago U.S. special operations forces snatched a suspect, Abu Anas al-Libi, from the streets of the Libyan capital Tripoli, bringing him to a naval vessel for interrogation without counsel or rights. U.S. Secretary of State John Kerry informed the press that the actions are legal because they comply with American law, eliciting no particular comment.

Principles are valid only if they are universal. Reactions would be a bit different, needless to say, if Cuban special forces kidnapped the prominent terrorist Luis Posada Carriles in Miami, bringing him to Cuba for interrogation and trial in accordance with Cuban law.

Such actions are restricted to rogue states. More accurately, to the one rogue state that is powerful enough to act with impunity: in recent years, to carry out aggression at will, to terrorize large regions of the world with drone attacks, and much else.

And to defy the world in other ways, for example by persisting in its embargo against Cuba despite the long-term opposition of the entire world, apart from Israel, which voted with its protector when the United Nations again condemned the embargo (188-2) in October.

Whatever the world may think, U.S. actions are legitimate because we say so. The principle was enunciated by the eminent statesman Dean Acheson in 1962, when he instructed the American Society of International Law that no legal issue arises when the United States responds to a challenge to its "power, position, and prestige."

Cuba committed that crime when it beat back a U.S. invasion and then had the audacity to survive an assault designed to bring "the terrors of the earth" to Cuba, in the words of Kennedy adviser and historian Arthur Schlesinger.

When the U.S. gained independence, it sought to join the international community of the day. That is why the Declaration of Independence opens by expressing concern for the "decent respect to the opinions of mankind."

A crucial element was evolution from a disorderly confederacy to a unified "treaty-worthy nation," in diplomatic historian Eliga H. Gould's phrase, that observed the conventions of the European order. By achieving this status, the new nation also gained the right to act as it wished internally.

It could thus proceed to rid itself of the indigenous population and to expand slavery, an institution so "odious" that it could not be tolerated in England, as the distinguished jurist William Murray, Earl of Mansfield, ruled in 1772. Evolving English law was a factor impelling the slave-owning society to escape its reach.

Becoming a treaty-worthy nation thus conferred multiple advantages: foreign recognition, and the freedom to act at home without interference. Hegemonic power offers the opportunity to become a rogue state, freely defying international law and norms, while facing increased resistance abroad and contributing to its own decline through self-inflicted wounds.

SOURCE: http://www.chomsky.info/articles/20131105.htm

15
Journalism & Mass Communication / Article of Noam Chomsky
« on: March 10, 2015, 05:02:29 PM »
Foreword to Michael Albert,
Realizing Hope: Life Beyond Capitalism

Noam Chomsky
Zed Books, 2014

Throughout much of the world there is growing resistance to the severe harm that has resulted from the neoliberal policies of the past generation. Latin America has progressed farthest in overthrowing this harsh regime, in recent years largely freeing itself from the grip of Western imperial domination for the first time and beginning to confront some of its severe internal problems, though many remain, as revealed recently by the mass protests in Brazil. These protests are joined by many others throughout the world, responding to local attacks on elementary rights and sometimes challenging dominant institutions and seeking to develop alternatives, escaping their fetters. They join in the effort to "realize hope," to build a better world, to develop structures and relationships that are essential to overcoming class, gender, racial, power, and other hierarchies that relegate the many to subordination and that allow the few to dominate. But how should these hopes be realized? That question has to be posed clearly, and answered to the extent that we can.

The task is fraught with risk. One can envision too much and in so doing exceed what anyone can now reasonably assert, an act of hubris that might close off rather than enrich creative initiatives and, even worse, usurp the rightful role of future citizens in determining their own lives and relations. However well-motivated, such blueprinting would threaten coercion rather than facilitate liberation.

Alternatively, one can praise values we all share but say too little about how they might be actualized and about the kinds of institutional features that would allow people to manage their own lives with dignity, solidarity, and equity. Realizing Hope, and I am now referring to the book, not the endeavor, carefully navigates this minefield of possible dangers. It aims to provide a worthy and viable vision that is much needed in the current climate of resistance, one that can inform, inspire, and generate shared programs without going beyond what we can sensibly envision and crossing the line to authoritarian prescription. It investigates a wide range of issues, including economy and polity, kinship and culture, international relations and ecology, and even journalism, science, and education, among other topics. It seeks to provide an outline for a wide-ranging exploration of long-term aims of resistance that will provide essential tools for movements seeking to bend the arc of history towards justice, to adapt Martin Luther King’s famous phrase.

It is surely necessary to resist oppression and pursue liberation — and also to advance towards realizing hope by gaining clarity about our objectives and constructing paths to attain them.

SOURCE: http://www.chomsky.info/articles/201409--.htm
