Friday, February 8, 2013

How the world’s oceans could be running out of fish

By Gaia Vince,
a science writer and broadcaster who is particularly interested in how humans are transforming planet Earth and the impacts our changes are having on societies and on other species. She has visited people and places around the world in a quest to understand how we are adapting to environmental change.

It has been some time since most humans lived as hunter-gatherers – with one important exception. Fish are the last wild animal that we hunt in large numbers. And yet, we may be the last generation to do so.

Entire species of marine life will never be seen in the Anthropocene (the Age of Man), let alone tasted, if we do not curb our voracious appetite for fish. Last year, global fish consumption hit a record high of 17 kg (37 pounds) per person per year, even though global fish stocks have continued to decline. On average, people eat four times as much fish now as they did in 1950.

Around 85% of global fish stocks are over-exploited, depleted, fully exploited or in recovery from exploitation. Only this week, a report suggested there may be fewer than 100 cod over the age of 13 years in the North Sea between the United Kingdom and Scandinavia. The figure is still under dispute, but it’s a worrying sign that we could be losing fish old enough to create offspring that replenish populations.

Large areas of seabed in the Mediterranean and North Sea now resemble a desert – the seas have been emptied of fish by increasingly efficient methods such as bottom trawling. And now, these heavily subsidised industrial fleets are cleaning out tropical oceans too. One-quarter of the EU catch is now made outside European waters, much of it in previously rich West African seas, where each trawler can scoop up hundreds of thousands of kilos of fish in a day. All West African fisheries are now over-exploited, and coastal fisheries have declined by 50% in the past 30 years, according to the UN Food and Agriculture Organisation.

Catches in the tropics are expected to decline by a further 40% by 2050, and yet some 400 million people in Africa and Southeast Asia rely on fish, caught mainly through artisanal fishing, for their protein and minerals. With climate change expected to hit agricultural production, people will rely more than ever on fish for their nutritional needs.

The policy of subsidising vast fishing fleets to catch ever-diminishing stocks is unsustainable. In Spain, for example, one in three fish landed is paid for by subsidy. Governments, concerned with keeping jobs alive in the fishing industry in the short term, are essentially paying people to extinguish their own long-term job prospects – not to mention those of the next generation of fishermen. Artisanal fishing catches half the world’s fish, yet it provides 90% of the sector’s jobs.

Protecting against depletion

Clearly, industrialised countries are not about to return to traditional methods. However, the disastrous management of the industry needs to be reformed if we are to restore fisheries to a sustainable level. In the EU alone, restoring stocks would result in greater catches of an estimated 3.5 million tonnes, worth £2.7 billion a year.

Rather than having a system in which EU members each hustle for the biggest quotas – which are already set far beyond what is sustainable – fisheries experts suggest individual governments should set quotas based on stock levels in their surrounding waters. Fishermen should be given responsibility for the fish they hunt – they have a vested interest in seeing stocks improve, after all – perhaps in the form of individual tradable catch shares of the quotas. Such policies end the tragedy of the commons, whereby everyone grabs as much as they can from the oceans before a rival nets the last fish, and the approach has been used successfully in countries from Iceland to New Zealand to the US. Research shows that fisheries managed in this way are twice as likely to avoid collapse as open-access fisheries.

In severely depleted zones, the only way to restore stocks is to introduce protected reserves where all fishing is banned. In other areas, quota compliance needs to be properly monitored – fishing vessels could be licensed and fitted with tracking devices to ensure they don’t stray into illegal areas, spot-checks could verify the size and species of catches, and fish could even be tagged so that authorities and consumers can confirm a sustainable source.

The other option is to do what humanity has usually done in the face of food shortages: move from hunting and gathering to farming.

Already, more than half of the fish we eat comes from farms – in China, it’s as high as 80% – but farming on an industrial scale has its problems. Farms are stocked with wild fish, which must then be fed – larger fish like salmon and tuna eat as much as 20 times their weight in smaller fish like anchovies and herring. This has led to overfishing of these smaller species. And if farmed fish are fed a vegetarian diet instead, they lack the prized omega-3 oils that make them nutritious, and they do not look or taste like the wild varieties. Scientists are working to create a truly artificial omega-3 – current “synthetic” versions are still derived from fish oils.

Fish farms are also highly polluting. They produce a slurry of toxic run-off – manure – which fertilises algae in the oceans, reducing the oxygen available to other species and creating dead zones. Scotland's salmon-farming industry, for example, produces the same amount of nitrogen waste as the untreated sewage of 3.2 million people – over half the country's population. As a result, there are campaigns to ban aquaculture from coastal areas.

Farmed fish are also breeding grounds for infection and parasites that kill off large proportions of fish – escapees then frequently infect wild populations. Farmers try to control infestations with antibiotics, but usually only succeed in creating a bigger problem of antibiotic resistance.

Dangerous predator

Humanity is not limiting its impact to the fish most commonly found on menus. Exotic sea creatures from turtles to manta rays to marine mammals are being hunted to extinction. Shark numbers, for example, have declined by 80% worldwide, with one-third of shark species now at risk of extinction. The top marine predator is no longer the shark; it’s us.

A decline in shark numbers has a significant impact on the marine ecosystem: it can lead to an increase in fish numbers further down the food chain, which in turn can cause a crash in the population of very small marine life, such as plankton. Without the smallest creatures, the entire system is threatened.

One of the repercussions, which I have discussed before, is an increase in jellyfish numbers, but overfishing, pollution, climate change and acidification also affect the marine ecosystem. Warmer waters are pushing species into different habitats, causing some to die off and others to adapt, in some cases by forming entirely new hybrid species. Meanwhile, trawlers are netting bycatch that includes marine mammals and even seabirds – as many as 320,000 seabirds are killed annually when they get caught in fishing lines, pushing populations of albatrosses, petrels and shearwaters to the edge of extinction.

Some solutions are easier than you might think. Seabirds can be protected by using weighted lines and by scaring birds away with lines that have flapping streamers attached – these methods alone have reduced seabird deaths by 85-99% where they are used.

Conservation plea

Strengthening and expanding protected marine reserves would also go a long way towards conserving species. Currently, less than 1% of the ocean is protected, although the international community has agreed to raise this to 10% by 2020. Reserves, when properly patrolled and monitored, do protect marine life, and nation after nation is stepping up. Tiny Pacific island states have banded together to create a giant protected area of 1.1 million square kilometres, for example. Not to be outdone, Australia has created the world’s biggest protected area, and countries from Britain to New Zealand are joining the effort.

But useful as they are, marine reserves – often drawn around features like coral reefs or rocky islands – are only effective if governments have the resources to patrol and protect them. Also, many marine creatures, from whale sharks to whales, are migratory – they don’t stay in the protected areas, making them easy prey for fishermen. What’s needed, many argue, are mobile reserves that follow migratory animals, and ones that shift with habitats as currents or climate phenomena like El Niño move them.

The zones need to be well targeted and needn’t harm fishermen’s livelihoods. For example, one study found that designating just 20 sites – 4% of the world’s oceans – as conservation zones could protect 108 species (84%) of the world’s marine mammals.

The rivers in many European cities were so overfished, polluted and dammed by the mid-20th century that they emptied of fish, and many species went locally extinct. But thanks to clean-ups, riverbank restoration and fishing restrictions, fish are returning to waterways, even in inner cities. A decade ago, few people would have imagined that salmon would return to my local river, the Thames. If it is possible to bring fish back to 'dead' rivers, there is surely hope for the world's oceans.


Will we ever… travel faster than the speed of light?

By Jennifer Ouellette,
an award-winning science writer whose work has appeared in Discover, New Scientist, Nature, and Physics World, among other venues.

Einstein said it is impossible, but as Jennifer Ouellette explains, some scientists are still trying to break the cosmic speed limit – even if it means bending the laws of physics.

Last summer, a small neutrino experiment in Europe called OPERA (Oscillation Project with Emulsion tRacking Apparatus) stunned the world with a preliminary announcement that it had clocked neutrinos arriving a few billionths of a second sooner than light would over the same course. The news even briefly overshadowed the far more recognizable Large Hadron Collider’s ongoing hunt for the Higgs boson.

Despite careful hedging by scientists, the popular imagination jumped right from neutrinos to a viable spacecraft for fast interstellar travel. After all, the prospect of faster-than-light (FTL) travel has been a science fiction staple for decades, from wormholes and Star Trek’s original warp drive, to the FTL “jumps” used to evade the Cylons in SyFy’s Battlestar Galactica reboot. It takes years, decades, centuries even to cross the vast expanses of space with our current propulsion technology – a realistic depiction of the tedium of space travel in entertainment would likely elicit the viewer equivalent of “Are we there yet?”

So the OPERA announcement was bound to generate excitement, even if the neutrinos in question were only moving nanoseconds faster than light – hardly sufficient to outrun the Cylons, but nevertheless faster than c, the cosmic speed limit set by Albert Einstein back in 1905.
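
To put those nanoseconds in perspective, using the figures reported at the time – a roughly 730-km baseline from CERN to the Gran Sasso lab, with the neutrinos arriving about 60 nanoseconds early – light covers that distance in about 2.4 milliseconds, so the claimed excess over c was minuscule:

\[ \frac{v - c}{c} \;\approx\; \frac{60 \times 10^{-9}\,\mathrm{s}}{2.4 \times 10^{-3}\,\mathrm{s}} \;\approx\; 2.5 \times 10^{-5} \]

That is about 0.0025% – tiny, but if real it would still have broken relativity.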

Unfortunately, the euphoria was premature: the OPERA results were incorrect, thanks to a calibration error. The culprit: a faulty cable connection in the GPS system used to time the neutrinos along their journey. That killjoy Einstein wins again.

But if the OPERA saga did tell us anything, it’s that the idea of travelling faster than light continues to capture the imagination. As Hollywood screenwriter Zack Stentz (Thor, a.k.a. “Vikings in Space”) said recently at a Los Angeles panel on the science of superheroes, “Every science fiction writer who wants to get out of the solar system [within a human lifetime] gloms onto that. It’s the leap of faith that lets you tell stories on this bigger canvas.”

“You cannae change the laws of physics”

“Leap of faith” is a particularly relevant phrase to use here. The fact is we’ll never be able to travel beyond the speed of light, at least based on our current understanding of established physics.

As any object with mass accelerates – like a proton in the LHC – it gains energy, and it takes ever more energy to accelerate it further. The LHC, the largest and highest-energy particle accelerator we have, boosts protons as close to the speed of light as we can get, but they never quite hit the mark. To reach the speed of light itself, a proton would need an infinite amount of energy, and we don’t have an infinite supply of energy.
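
That divergence falls straight out of special relativity: a particle of rest mass m moving at speed v carries energy

\[ E = \gamma m c^{2}, \qquad \gamma = \frac{1}{\sqrt{1 - v^{2}/c^{2}}}, \]

so as v approaches c the Lorentz factor \(\gamma\) – and with it the energy required – grows without bound, becoming infinite at v = c.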

Equations don’t tend to lie, especially ones that have been tested and re-tested in countless experiments for over a century. For all practical intents and purposes, the speed of light is an insurmountable threshold.

But physicists would never make any progress at all if they threw in the towel quite that easily, and nobody thinks Einstein will have the final word in perpetuity. Many scientists are happy to consider the possibility of violations of relativistic principles, even if none have yet been experimentally confirmed.

One of the earliest proposed possibilities for FTL travel involved a hypothetical particle called a tachyon, capable of tunnelling past the speed-of-light barrier. This turned out to be more of a mathematical artifact than an actual physical particle.

However, another reason for all the OPERA-tic excitement was that back in 1985, physicists proposed that some high-energy neutrinos might really be tachyons, capable of interacting with an as-yet-unknown field that gives them just enough of an energy boost to break through the barrier. Such tachyon-like neutrinos would supersede photons as the fastest particles in the universe.

OPERA’s calibration error dashed those hopes, but there are still plenty of potential loopholes to be explored, such as the Star Trek-inspired warp drive mechanism first proposed by Mexican physicist Miguel Alcubierre in 1994. In general relativity, spacetime is dynamic, not static, warping and bending in response to the presence of mass or energy. Alcubierre suggested that it might be possible to encase a spaceship within a “warp bubble”, whereby space contracted in front of the craft and expanded behind it, enabling it to travel faster than light. But within that bubble, spacetime would remain essentially flat and the craft would technically “obey” the cosmic speed limit.
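
For the mathematically inclined, the proposal can be stated compactly. In the metric from Alcubierre’s 1994 paper (written here with c explicit), a “shift” function f carries the bubble along the x-axis at an arbitrary speed v_s while spacetime inside stays essentially flat:

\[ ds^{2} = -c^{2}\,dt^{2} + \bigl(dx - v_{s}\,f(r_{s})\,dt\bigr)^{2} + dy^{2} + dz^{2}, \]

where f(r_s) is close to 1 inside the bubble and falls to 0 far from it. Nothing inside the bubble ever locally outruns light; it is the geometry itself that does the travelling.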

Alas, once again we face an energy problem: achieving that degree of curvature would require enormous amounts of energy – and negative energy at that – equivalent to the mass of Jupiter. Propelling a spacecraft across the Milky Way galaxy might require more energy than is contained in the mass of the entire universe. A more energy-efficient ring-shaped design for such a warp drive was described recently at a symposium on interstellar space flight, offering diehard space acolytes a meagre shred of hope that, for future generations, warp drive might become a reality.

However, given what we know about general relativity and quantum field theory, “It almost certainly can’t be done,” says Ken Olum, a cosmologist at Tufts University. “Of course, if we are talking about quantum gravity, it’s hard to know, because we don’t really know what that is.”

Former Nasa scientist Kevin Grazier, who was the technical consultant for Battlestar Galactica, says that a version of the Alcubierre warp drive inspired the “jump drive” used in that series. It was based on the assumption that, in this fictional world, the Colonials had merged theories of electromagnetism and gravity, such that if you could create a very intense electromagnetic field, it would be functionally equivalent to an intense gravitational field capable of warping spacetime. Turning that ingenious fiction into a viable reality is another matter altogether.

Brane gain

If we really want to get speculative, Olum suggests FTL travel would be possible if exotic concepts, like those that emerge from superstring theory, prove to be correct.

We inhabit four-dimensional spacetime, but various permutations of superstring theory suggest our universe is just one of many, co-existing within a bubble of five-dimensional spacetime called the “bulk.” Within that bulk, our universe lines up in parallel with all the others, just like the pages in a book. Olum explains that, hypothetically, one could take a shortcut through the bulk, thereby arriving at your destination sooner than if you had travelled along your four-dimensional surface, or brane (short for membrane) as it is known.

Even then, there is a catch. “In brane theories, only gravitons can travel through the bulk,” says Olum. So one would need to invent a machine that could scan an object and transmit the information in the form of gravitons to a second machine on the other end which would then reconstruct that object – shades of teleportation, only with gravitons.

Considering we have yet to observe gravitons in our most powerful accelerators, and the current record for quantum teleportation – of photon states, not even atoms – is the relatively non-Cylon-troubling distance of 143 kilometres (88 miles), this scenario must also remain firmly in the realm of science fiction, at least for now. Science advances, but it does so slowly, at a pace nowhere near the speed of light.

Will we ever… simulate the human brain?

By Ed Yong, BBC

A billion dollar project claims it will recreate the most complex organ in the human body in just 10 years. But detractors say it is impossible. Who is right?

For years, Henry Markram has claimed that he can simulate the human brain in a computer within a decade. On 23 January 2013, the European Commission told him to prove it. His ambitious Human Brain Project (HBP) won one of two ceiling-shattering grants from the EC to the tune of a billion euros, ending a two-year contest against several other grandiose projects. Can he now deliver? Is it even possible to build a computer simulation of the most powerful computer in the world – the 1.4-kg (3 lb) cluster of 86 billion neurons that sits inside our skulls?

The very idea has many neuroscientists in an uproar, and the HBP’s substantial budget, awarded at a tumultuous time for research funding, is not helping. The common refrain is that the brain is just too complicated to simulate, and our understanding of it is at too primitive a stage.

Then, there’s Markram’s strategy. Neuroscientists have built computer simulations of neurons since the 1950s, but the vast majority treat these cells as single abstract points. Markram says he wants to build the cells as they are – gloriously detailed branching networks, full of active genes and electrical activity. He wants to simulate them down to their ion channels – the molecular gates that allow neurons to build up a voltage by shuttling charged particles in and out of their membrane borders. He wants to represent the genes that switch on and off inside them. He wants to simulate the 3,000 or so synapses that allow neurons to communicate with their neighbours.
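
To see what the crudest end of this modelling spectrum looks like, here is a minimal sketch in Python of a leaky integrate-and-fire neuron – the kind of single-point abstraction most simulations use, far cruder than the ion-channel-level detail Markram is after. All parameter values are illustrative placeholders, not figures from the HBP.

# Minimal leaky integrate-and-fire neuron: a single-point abstraction.
# All parameter values below are illustrative, not from any real model.
def simulate_lif(input_current_nA=2.0, t_stop_ms=100.0, dt_ms=0.1):
    v_rest = -65.0     # resting membrane potential, mV
    v_thresh = -50.0   # spike threshold, mV
    v_reset = -70.0    # potential right after a spike, mV
    tau_m = 10.0       # membrane time constant, ms
    r_m = 10.0         # membrane resistance, MOhm (MOhm * nA = mV)

    v = v_rest
    spike_times = []
    t = 0.0
    while t < t_stop_ms:
        # Leaky integration (forward Euler): dV/dt = (-(V - V_rest) + R*I) / tau
        v += (-(v - v_rest) + r_m * input_current_nA) / tau_m * dt_ms
        if v >= v_thresh:          # threshold crossed: record a spike...
            spike_times.append(t)
            v = v_reset            # ...and reset the membrane potential
        t += dt_ms
    return spike_times

print(simulate_lif())  # spike times in ms for a constant 2 nA input

Ion-channel-level models such as Hodgkin-Huxley replace the single leak term with several voltage-dependent conductances, each with its own dynamics – which is where the complexity, and the data hunger, explodes.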

Erin McKiernan, who builds computer models of single neurons, is a fan of this bottom-up approach. “Really understanding what’s happening at a fundamental level and building up – I generally agree with that,” she says. “But I tend to disagree with the time frame. [Markram] said that in 10 years, we could have a fully simulated brain, but I don’t think that’ll happen.”

Even building McKiernan’s single-neuron models is a fiendishly complicated task. “For many neurons, we don’t understand well the complement of ion channels within them, how they work together to produce electrical activity, how they change over development or injury,” she says. “At the next level, we have even less knowledge about how these cells connect, or how they’re constantly reaching out, retracting or changing their strength.” It’s ignorance all the way down.

“For sure, what we have is a tiny, tiny fraction of what we need,” says Markram. Worse still, experimentally mapping out every molecule, cell and connection is completely unfeasible in terms of cost, technical requirements and motivation. But he argues that building a unified model is the only way to unite our knowledge, and to start filling in the gaps in a focused way. By putting it all together, we can use what we know to predict what we don’t, and to refine everything on the fly as new insights come in.

Network construction

The crucial piece of information, and the one Markram’s team is devoting the most time towards, is a complete inventory of which genes are active in which neurons. Neurons aren’t all the same – they come in a variety of types that perform different roles and deploy different genes. Once Markram has the full list – the so-called “single-cell transcriptome” – he is confident that he can use it to deduce the blend of different neurons in various parts of the brain, recreate the electrical behaviour of each type of cell, or even simulate how a neuron’s branches would grow from scratch. “We’re discovering biological principles that are putting the brain together,” he says.

For over two decades, his team have teased out the basic details of a rat’s neurons, and produced a virtual set of cylindrical brain slices called cortical columns. The current simulation has 100 of these columns, and each has around 10,000 neurons – less than 2% of a rat’s brain and just over 0.001% of ours. “You have to practice this first with rodents so you’re confident that the rules apply, and do spot checks to show that these rules can transfer to humans,” he says.
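
A quick sanity check on those percentages: 100 columns of around 10,000 neurons each gives about 10^6 simulated neurons. Against the roughly 2 x 10^8 neurons of a rat brain that is about 0.5%, comfortably “less than 2%”; against our 8.6 x 10^10 neurons it is

\[ \frac{10^{6}}{8.6 \times 10^{10}} \;\approx\; 1.2 \times 10^{-5} \;\approx\; 0.001\%. \]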

Eugene Izhikevich from the Brain Corporation, who helped to build a model with 100 billion neurons, is convinced that we should be able to build a network with all the anatomy and connectivity of a real brain. An expert could slice through it and not tell the difference. “It’d be like a Turing test for how close the model would be to the human brain,” he says.

But that would be a fantastic simulation of a dead brain in an empty vat. A living one pulses with electrical activity – small-scale currents that travel along neurons, and large waves that pass across entire lobes. Real brains live inside bodies and interact with environments. If we could simulate this dynamism, what would emerge? Learning? Intelligence? Consciousness?

“People think I want to build this magical model that will eventually speak or do something interesting,” says Markram. “I know I’m partially to blame for it – in a TED lecture, you have to speak in a very general way. But what it will do is secondary. We’re not trying to make a machine behave like a human. We’re trying to organise the data.”

Function first

That worries neuroscientist Chris Eliasmith from the University of Waterloo in Ontario, Canada. “The project is impressive but might leave people baffled that someone would spend a lot of time and effort building something that doesn’t do anything,” he says. Markram’s isn’t the only project to do this. Last November, IBM presented a brain simulation called SyNAPSE, which includes 530 billion neurons with 100 trillion synapses connecting them, and does... not very much. It’s basically a big computer. It still needs to be programmed. “Markram would complain that those neurons aren’t realistic enough, but throwing a ton of neurons together and approximately wiring them according to biology isn’t going to bridge this gap,” says Eliasmith.

Eliasmith has taken a completely different approach. He is putting function first. Last November, he unveiled a model called Spaun, which simulates a relatively paltry 2.5 million neurons but shows behaviour. It still simulates the physiology and wiring of the individual neurons, but organises them according to what we know about the brain’s architecture. It’s a top-down model, as well as a bottom-up one, and sets the benchmark for brain simulations that actually do something. It can recognise and copy lists of numbers, carry out simple arithmetic, and solve basic reasoning problems. It even makes errors in the same way we do – for example, it’s more likely to remember items at the start and end of a list.

But the point of Spaun is not to build an artificial brain either. It’s a test-bed for neuroscience – a platform that we can use to understand how the brain works. Does Region X control Function Y? Build it and see if that’s true. If you knock out Region X, will Spaun’s mental abilities suffer in a predictable way? Try it.

This kind of experiment will be hard to do with the HBP’s bottom-up architecture. Even if that simulation shows properties like intelligence, it will be difficult to understand where those came from. It won’t be a simple matter of tweaking one part of the simulation and seeing what happens. If you are trying to understand the brain and you do a really good simulation, the problem is that you end up with... the brain. And the brain is very complicated.

Besides, Izhikevich points out that technology is quickly outpacing many of the abilities that our brains are good at. “I can do arithmetic better on a calculator. A computer can play chess better than you,” he says. By the time a brain simulation is sophisticated enough to reproduce the brain’s full repertoire of behaviour, other technologies will be able to do the same things faster and better, and “the problem won’t be interesting anymore,” says Izhikevich.

So, simulating a brain isn’t a goal in itself. It’s a means to an end. It’s a way of organising tools, experts, and data. “Walking the path is the most important part,” says Izhikevich.

Wednesday, February 6, 2013

E-commerce in Greece - The right side of the Styx?

JEFF BEZOS founded Amazon in 1994. Apostolos Apostolakis and his mates started e-shop.gr, Greece’s biggest online retailer, just four years later. The comparisons end there. The Seattle juggernaut’s annual sales grow at double-digit rates; e-shop’s have been savaged by Greece’s depression. Amazon made its name selling books. E-shop was stymied by regulated book prices and shifted early into electronics. The Americans have indulgent shareholders while the Greeks were nearly undone by skimpy equity.

Economic woes aside, Greece is tough terrain for online shopping. Less than half of Greeks are regular internet users, compared with two-thirds of Europeans overall. More than 40% of Europeans shop online but fewer than 20% of Greeks do. Broadband connections are sparser and consumers are warier. Most refuse to submit credit-card details online, preferring to pay cash on delivery. Islands make Greece an obstacle course for couriers.

E-shop found clever fixes. It has a fleet of 50 trucks to make deliveries and collect cash (Amazon relies mainly on outsiders for last-mile logistics). Unusually for an e-tailer, it has a network of 52 shops. These do not hold stock. They are another channel for accepting payment and avoid the cost of shipping to a customer’s house. They also serve to advertise the e-shop brand.

This ingenuity did not spare e-shop the ravages of Greece’s economic calamity, which struck just as the investment in the shop network was completed. Sales dropped from a peak of €128m ($169m) in 2009 to €46m in 2011. Unlike a typical Silicon Valley startup, e-shop was not nurtured by a venture-capital fund and passed up a chance to be bought when times were good. When the crisis hit, banks cut credit. Without the working capital needed to hold inventory, e-shop was forced to stretch out delivery times. Its sales dropped by more than those of competitors such as Kotsovolos, which is owned by Dixons, a British electronics merchant.

The worst may be over. E-shop filed for protection from its creditors, which has eased its working-capital squeeze. Now more than half of orders are delivered the next day. It has slashed costs, partly by paring back its bricks-and-mortar network. That is a prelude to a hoped-for debt reduction in 2013. Mr Apostolakis sees signs that Greeks are warming to internet shopping. Online air tickets are popular, and that is getting consumers used to paying by credit card. Internet retailing is growing at double the European rate. In 2012 e-shop’s turnover recovered to €60m. Maybe someday the Amazon analogy will not seem so far-fetched.

Without words



Tuesday, February 5, 2013

Hackers hit U.S. Department of Energy

The U.S. Department of Energy has confirmed that its computer systems were hacked into last month. According to The New York Times, the federal agency sent around an internal e-mail on Friday telling its employees about the cyberattack.

"The Department of Energy has just confirmed a recent cyber incident that occurred in mid-January which targeted the Headquarters' network and resulted in the unauthorized disclosure of employee and contractor Personally Identifiable Information," the e-mail said.
The agency said that it is working to figure out the "nature and scope of the incident" but that so far it believes "no classified data was compromised." It's unclear which divisions within the Department of Energy were attacked or who was behind the hack.

The Department of Energy is in charge of much of the country's vital infrastructure, such as energy production, nuclear reactor production, and radioactive waste disposal. It has troves of classified and sensitive data that, if leaked, could be detrimental to the country's security. According to Reuters, the most highly classified information is stored on networks that aren't connected to the Internet.

The head of Homeland Security, Janet Napolitano, recently announced that she believes a wave of cyberattacks on U.S. infrastructure is a serious possibility. Dubbing such an event a "cyber 9/11," Napolitano warned that cyberterrorists could take down the nation's power grid, water infrastructure, transportation networks, and financial networks.

While the January cyberattack on the Department of Energy does not appear to have compromised classified data or infrastructure, it did expose personal information and showed that hackers were able to breach the government's computer systems. In the e-mail, the agency said it is working to fortify itself against future attacks.

"Once the full nature and extent of this incident is known, the Department will implement a full remediation plan," the e-mail said. "The Department is also leading an aggressive effort to reduce the likelihood of these events occurring again. These efforts include leveraging the combined expertise and capabilities of the Department's Joint Cybersecurity Coordination Center to address this incident, increasing monitoring across all of the Department's networks and deploying specialized defense tools to protect sensitive assets."


Monday, February 4, 2013

iPad-NMControl - New product announcement by nuova marea ltd

iPad-NMControl for Yachting applications

Control your existing audio/visual system, lighting, blinds and more with your iPad.

Operate complicated systems with a simple, user-friendly application.

No need to upgrade your installation.

With a series of low-cost interfaces, take control of your lighting and existing hardware.

Custom design for every boat.

For more information, contact sales@nuovamarea.com