The US Navy has developed a radical new fuel made from seawater.
They say it could change the way we produce fuel – and allow warships to stay at sea for years at a time.
Navy scientists have spent several years developing the process to turn seawater into fuel, and have now used the ‘game-changing’ fuel to power a radio-controlled plane in a first test.
The development of a liquid hydrocarbon fuel is being hailed as ‘a game-changer’ because it would allow warships to remain at sea for far longer.
The US has a fleet of 15 military oil tankers, and only its aircraft carriers and some submarines are equipped with nuclear propulsion.
All other vessels must frequently interrupt their missions for a few hours to sail alongside a tanker and refuel – a delicate operation, especially in bad weather.
The ultimate goal is to get away from dependence on oil altogether, which would also mean the navy is no longer hostage to potential shortages of oil or fluctuations in its cost.
The predicted cost of jet fuel using these technologies is in the range of $3-$6 per gallon, and with sufficient funding and partnerships, this approach could be commercially viable within the next seven to ten years.
Pursuing remote land-based options would be the first step towards a future sea-based solution, the Navy says.
Vice Admiral Philip Cullom declared: ‘It’s a huge milestone for us.
‘We are in very challenging times where we really do have to think in pretty innovative ways to look at how we create energy, how we value energy and how we consume it.
‘We need to challenge the assumptions that are the result of the last six decades of constant access to cheap, unlimited amounts of fuel,’ added Cullom.
‘Basically, we’ve treated energy like air, something that’s always there and that we don’t worry about too much.
‘But the reality is that we do have to worry about it.’
They hope the fuel will be able to power not only ships but also planes.
That estimate of three to six dollars per gallon comes from experts at the US Naval Research Laboratory, who have already flown a model airplane on fuel produced from seawater.
Dr Heather Willauer, a research chemist who has spent nearly a decade on the project, said: ‘For the first time we’ve been able to develop a technology to get CO2 and hydrogen from seawater simultaneously. That’s a big breakthrough.’
She added that the fuel ‘doesn’t look or smell very different.’
Now that they have demonstrated it can work, the next step is to produce it in industrial quantities.
But before that, in partnership with several universities, the experts want to increase the amount of CO2 and hydrogen they can capture.
‘We’ve demonstrated the feasibility, we want to improve the process efficiency,’ explained Willauer.
Cullom is just as excited.
‘For us in the military, in the Navy, we have some pretty unusual and different kinds of challenges,’ he said.
‘We don’t necessarily go to a gas station to get our fuel, our gas station comes to us in terms of an oiler, a replenishment ship.
‘Developing a game-changing technology like this, seawater to fuel, really is something that reinvents a lot of the way we can do business when you think about logistics, readiness.’
A crucial benefit, says Cullom, is that the fuel can be used in the same engines already fitted in ships and aircraft.
‘If you don’t want to re-engineer every ship, every type of engine, every aircraft, that’s why we need what we call drop-in replacement fuels that look, smell and essentially are the same as any kind of petroleum-based fuels.’
Drawbacks? Only one, it seems: researchers warn it will be at least a decade before US ships are able to produce their own fuel on board.
The basic principle behind hydrogen fuel cells is fairly simple: Hydrogen atoms are stripped of their electrons to generate electricity and then combined with oxygen to form water as a by-product. Mainstream deployment of fuel-cell vehicles, though, has proved to be complex. Compared with liquid fuels, hydrogen is tough to transport and store. And without a meaningful number of vehicles on the road, there’s been no incentive to build hydrogen fuel infrastructure. Now new initiatives in California and across the U.S. are pushing for a long-awaited expansion of the refueling network. And with the debut of three promising hydrogen-fuel-cell vehicles from Honda, Hyundai, and Toyota, consumers will have new options beginning in 2014. Are we finally seeing the dawn of the hydrogen age? Not so fast.
The current hydrogen push has less to do with consumer demand than with government incentives that treat fuel-cell vehicles (FCVs) as equal to or better than electric vehicles. In California the combination of 300-mile range and fast refueling gives fuel cells the maximum available zero-emission vehicle (ZEV) credits. That makes it easy for a manufacturer to meet the state’s ZEV mandate with fewer cars. On the federal level, both FCVs and EVs get an EPA credit multiplier of 2.0 beginning in 2017, which means that sales of either type of car confer a disproportionate benefit on the ledger for an automaker’s entire fleet. In response, manufacturers have formed several high-profile partnerships, including Ford/Daimler/Renault-Nissan, BMW/Toyota, and GM/Honda to develop the vehicles. On the fueling side, a recent infusion of $20 million of funding per year has expanded the California Fuel Cell Partnership’s plan to 100 statewide refueling stations. The Department of Energy’s H2USA organization wants to use California’s efforts as a blueprint for the rest of the nation.
CAN I BUY A FUEL-CELL CAR?
In the past, fuel-cell vehicles have only been available in the hundreds. The three new FCVs slated for production this year and next will increase the volume to thousands, but they will be available primarily in California, where most of the country’s hydrogen stations exist. According to Alan Baum, an automotive analyst at Baum and Associates, even if the stations proliferate, fuel-cell vehicles, like EVs, won’t dominate the market. “It’s not going to be a widespread technology, and for that matter it doesn’t need to be,” he says. “We’re doing an all-hands-on-deck strategy.”
ARE THEY PRACTICAL?
Not according to Tesla and SpaceX founder Elon Musk, who says fuel cells are more of a marketing ploy than a realistic solution. Nissan CEO Carlos Ghosn agrees: “Knowing all the problems we have with charging [EVs], where is the hydrogen infrastructure?” Both men have a bias toward electric vehicles, but the infrastructure issue is a big one. With the current cost of a hydrogen filling station at more than $1 million, neither the government nor the corporate world has any plans for a rapid expansion of the filling network. “We’ve got electricity everywhere,” Baum says. “Putting in 240-volt charging units requires some effort and expense, but it’s not game changing. Putting in hydrogen is.”
WHERE DOES THE POWER COME FROM?
Here’s the abridged version: Compressed hydrogen from the storage tank (A) is stripped of its electrons in the fuel-cell stack (B), creating electricity. A power-control unit (C) orchestrates the flow of energy from the stack to the battery (D), which powers the electric motor that moves the car. The battery ensures full power during acceleration until the fuel cell reaches peak voltage. Got all that?
ARE THEY SAFE?
Yes. Stringent requirements established by the Department of Transportation (DOT) and Society of Automotive Engineers (SAE) ensure that the technology is safe. Automakers are required to build robust hydrogen storage tanks that not only hold the fuel at up to 10,000 psi but also withstand arcane-sounding trials such as “bonfire” and “gunshot” tests by the DOT. Tanks are usually made of several layers of carbon fiber wrapped around aluminum or polyethylene liners, and many are also protected by external layers of steel. Regulations covering PRDs (pressure-relief devices) govern the temperatures and pressures at which gas is vented, which are typically set well below the point at which conditions would become unsafe.
HOW GREEN ARE FUEL CELLS?
It depends on where you look. The only tailpipe emission from an FCV is water, but the process of creating hydrogen fuel – just like that of formulating gasoline or generating current for an electric vehicle – has an environmental impact. More than 90 percent of hydrogen today is created using a natural-gas-reforming process involving steam and methane, which reduces CO2 emissions from “well to wheel” by approximately 60 percent, compared with the process of creating gasoline. So, carbon dioxide is still released into the atmosphere – it just happens before the hydrogen gets to your tank. Incentives and mandates encourage a cleaner hydrogen-creation process: The state of California requires that 30 percent of H2 supplied for transportation come from renewable sources, which can include wind, solar, and biomass material.
WHAT ABOUT REFUELING?
One advantage of FCVs is that they can travel farther and restore range faster than most current EVs. Refueling is simple: Once a nozzle with a snap collar is securely mated and locked to your car, the transfer of hydrogen begins with a brief hissing sound, followed by a 3- to 5-minute fill-up. However, it takes considerably longer for a filling station to restore the pressure required to service the next vehicle, so current setups can only refuel six or so cars per hour.
SO, IS HYDROGEN HAPPENING?
“When you have several major carmakers saying we’re going to invest in this, that’s significant,” Baum says. But vehicles are just one piece of the puzzle. Every other player in the hydrogen supply chain, such as the service station industry, needs to invest heavily. Until then, refueling options and vehicle choices will remain extremely limited, with no guarantee of expansion. Which is to say that hydrogen-fuel-cell cars will be a minor footnote in terms of overall vehicle sales for the foreseeable future. For all but the earliest of adopters, hydrogen as a prominent fuel alternative remains somewhere on the horizon.
The world’s largest bitcoin trading exchange shut down on Tuesday, sparking a massive sell-off that calls into question the long-term viability of the nascent virtual currency trade.
“This is extremely destructive,” risk-management expert and former Federal Reserve Bank Examiner Mark Williams told the Los Angeles Times. “What we’re seeing is a lot of the flaws. It’s not only fragile, it’s fragile as eggshells.”
The halt in trading occurred when reports hit the Internet that the Tokyo-based Mt. Gox bitcoin exchange suffered the theft of 744,000 bitcoins worth an estimated $380 million.
Internet currency forums are now asking the question whether “bitcoin” has morphed into “shitcoin.”
Others expressed optimism that the crisis will spawn better measures.
“I think it’s a significant event, but I think there’s a decent chance that it is part of what we would call this sort of shaking out of the industry as it matures and slowly becomes a little more regulated,” New York state’s top financial regulator Benjamin M. Lawsky told the New York Times.
The electricity price index soared to a new high in January 2014 with the largest month-to-month increase in almost four years, according to the Bureau of Labor Statistics.
Meanwhile, data from the Energy Information Administration, a division of the U.S. Department of Energy, indicates that electricity production in the United States has declined since 2007, when it hit its all-time peak.
The U.S. is producing less electricity than it did seven years ago for a population that has added more than 14 million people.
“The electricity index rose 1.8 percent, its largest increase since March 2010,” said BLS in its summary of the Consumer Price Index released Thursday.
In December, the seasonally adjusted electricity index was 203.740. In January, it climbed to a new high of 207.362.
Back in January 2013, the electricity price index stood at 198.679. It thus climbed about 4.4 percent over the course of a year.
Last month, the average price for a kilowatthour (KWH) of electricity in a U.S. city also hit an all-time January high of 13.4 cents, according to BLS. That marks the first time the average price for a KWH has ever exceeded 13 cents in the month of January, when the price of electricity is normally lower than in the summer months.
A year ago, in January 2013, a KWH cost 12.9 cents. The increase in the price of a KWH from January 2013 to January 2014 was about 3.9 percent.
During the year, the price of a KWH of electricity usually rises in the spring, peaks in summer, declines in fall, and is at its lowest point in winter. In 2013, the average price of a KWH in each of the 12 months of the year set a record for that particular month. January 2014’s price of 13.4 cents per KWH set a new record for January.
Historically, in the United States, rising electricity prices have not been inevitable. In the first decades after World War II, the U.S. rapidly increased its electricity production, including on a per capita basis. Since 2007, the U.S. has decreased its electricity production, including on a per capita basis.
In the 1950s and 1960s, when U.S. electricity generation was increasing at a rapid pace, the seasonally adjusted U.S. electricity price index remained relatively stable. In January 1959, the electricity index stood at 29.2, according to BLS. A decade later, in January 1969, it was 30.2—an increase of 3.4 percent over a 10-year span.
That 3.4-percent increase in the index from January 1959 to January 1969 was less than the 4.4 percent the index increased from January 2013 to January 2014.
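Those two percent-change figures are straightforward to verify. A minimal sketch in Python, using only the index values quoted above:

```python
def pct_change(old, new):
    """Percent change from an old value to a new one."""
    return (new - old) / old * 100

# BLS seasonally adjusted electricity price index values quoted above
jan_1959, jan_1969 = 29.2, 30.2          # ten-year span
jan_2013, jan_2014 = 198.679, 207.362    # one-year span

print(round(pct_change(jan_1959, jan_1969), 1))  # → 3.4 (over a decade)
print(round(pct_change(jan_2013, jan_2014), 1))  # → 4.4 (over a single year)
```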
Over the last seven years, according to the EIA, the U.S. has actually decreased its total net electricity generation, although not in an unbroken downward line from year to year (generation did increase from 2009 to 2010 before going down again in 2011 and 2012).
The EIA has published historical data going back to 1949 on the nation’s annual total net electricity generation, which EIA measures in million kilowatthours.
In 1949, according to EIA, the U.S. produced 296,124.289 million KWH of electricity. By 1959, it produced 713,378.831 – an increase of 417,254.542 million KWH or about 141 percent.
In 1969, the U.S. produced 1,445,458.056 million KWH – an increase of 732,079.225 or about 103 percent from 1959.
In 1979, the U.S. produced 2,250,665.025 million KWH – an increase of 805,206.969 or about 55.7 percent from 1969.
In 1989, the U.S. produced 2,967,146.087 million KWH – an increase of 716,481.062 or about 31.8 percent from 1979.
In 1999, the U.S. produced 3,694,809.810 million KWH – an increase of 727,663.723 or about 24.5 percent from 1989.
In 2009, the U.S. produced 3,950,330.927 million KWH – an increase of 255,521.117 or about 6.9 percent from 1999.
In 2007, according to EIA, the U.S. generated a net total of 4,156,744.724 million KWH of electricity, which, so far, is the historical peak. In 2012, the last year for which full data is available, the U.S. generated a net total of 4,047,765.26 million KWH. That represents a drop of 108,979.464 million KWH – or about 2.6 percent – in the nation’s electricity production since 2007.
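The decade-by-decade increases listed above follow directly from the EIA totals. A quick sketch, with figures in million kilowatthours as quoted:

```python
# EIA annual total net electricity generation, million KWH, as quoted above
generation = {
    1949: 296_124.289,
    1959: 713_378.831,
    1969: 1_445_458.056,
    1979: 2_250_665.025,
    1989: 2_967_146.087,
    1999: 3_694_809.810,
    2009: 3_950_330.927,
}

years = sorted(generation)
for prev, cur in zip(years, years[1:]):
    increase = generation[cur] - generation[prev]
    growth = increase / generation[prev] * 100
    print(f"{prev}-{cur}: +{increase:,.3f} million KWH ({growth:.1f}%)")
```

The printed growth rates (140.9, 102.6, 55.7, 31.8, 24.5 and 6.9 percent) match the rounded decade figures in the text.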
CNSNews.com divided the million KWH of electricity generated each year in the United States, according to EIA, by the number of people in the U.S. population as of July of that year (as estimated by the Census Bureau) to derive a number for per capita electricity production (see chart).
As with overall electricity production, per capita production exhibited decelerating growth over the decades, peaked in 2007, and has since declined.
From 1950 to 1959, per capita total electricity generation (in million KWH) grew by 83.11 percent; from 1960 to 1969, it grew by 69.76 percent; from 1970 to 1979, it grew by 33.51 percent; from 1980 to 1989, it grew by 19.25 percent; from 1990 to 1999, it grew by 11.25 percent.
From 2000 to 2009, per capita total net electricity generation in the United States declined by 4.45 percent.
In 2007, when U.S. electricity generation peaked at 4,156,744.724 million KWH, per capita production also peaked at 0.013799 million KWH for each of the 301,231,207 people in the country as of July of that year.
In 2012, the U.S. generated 4,047,765.26 million KWH for a population of 313,914,040—for a per capita production of 0.012895. That means per capita electricity production in the U.S. declined by about 6.6 percent in five years.
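CNSNews.com’s per capita method – annual net generation divided by the July population estimate – can be sketched in a few lines, using the figures quoted above:

```python
# Net generation (million KWH) and July population estimates quoted above
gen_2007, pop_2007 = 4_156_744.724, 301_231_207
gen_2012, pop_2012 = 4_047_765.26, 313_914_040

per_capita_2007 = gen_2007 / pop_2007   # ≈ 0.013799 million KWH per person
per_capita_2012 = gen_2012 / pop_2012   # ≈ 0.012895 million KWH per person

decline = (per_capita_2007 - per_capita_2012) / per_capita_2007 * 100
print(round(decline, 1))  # → 6.6 (percent decline over five years)
```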
The downward trend in U.S. electricity production continued into 2013. The EIA’s latest Monthly Energy Review, which includes data through October 2013, indicates that in the first ten months of 2013, the U.S. generated a total of 3,392,101 million KWH of electricity, down from the 3,407,155 million KWH produced in the first 10 months of 2012.
The Monthly Energy Review also indicates that a large part of the decline in U.S. electricity generation has come from a decrease in the electricity produced by coal – which has not been replaced by a commensurate increase in the electricity produced by natural gas or the “renewable” sources of wind and solar.
In 2007, the year U.S. electricity generation peaked at 4,156,745 million KWH, coal accounted for 2,016,456 million KWH of that production – or 48.5 percent of it. Natural gas, then the nation’s second largest generator of electricity, accounted for 896,590 million KWH of total production – or about 21.6 percent.
In 2007, wind generated 34,450 million KWH – or about 0.8 percent of the nation’s supply that year. Solar generated 612 million KWH – or about 0.0147 percent of the national supply.
By 2012, when U.S. electricity generation had dropped to 4,047,765 million KWH, coal generated only 1,514,043 million KWH – or 37.4 percent of the national supply.
Between 2007 and 2012, the nation’s annual coal-fired electricity generation declined by about 25 percent, or 502,413 million KWH. The combined increases in natural gas, wind and solar did not make up for this decline. In 2012, natural gas produced 1,225,894 million KWH, up 329,304 million KWH from 2007; wind produced 140,822, up 106,372 million KWH from 2007; and solar produced 4,327 million KWH, up 3,715 million KWH from 2007.
The combined 439,391 million KWH increase in electricity generation from natural gas, wind and solar did not cover the 502,413 million KWH decline in the electricity generated by coal.
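The shortfall follows directly from the quoted 2007-to-2012 changes. A minimal check:

```python
# 2007-to-2012 changes in annual generation, million KWH, as quoted above
coal_decline = 502_413
gains = {"natural gas": 329_304, "wind": 106_372, "solar": 3_715}

total_gain = sum(gains.values())
print(total_gain)                   # → 439391 (the combined gain)
print(coal_decline - total_gain)    # → 63022 (million KWH not made up)
```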
Coal was not the only source that produced less electricity in 2012 than in 2007, according to the EIA data.
Electricity from nuclear power plants dropped from 806,425 million KWH in 2007 to 769,331 in 2012 – a decline of 37,094 million KWH or 4.6 percent.
Electricity generated from petroleum sources dropped from 65,739 million KWH in 2007 to 23,190 million KWH in 2012—a decline of 42,549 million KWH or about 64.7 percent.
Conventional hydroelectric means of generating electricity hit their peak in 1997, a decade before overall electricity generation peaked in the United States. In that year, the U.S. produced 385,946 million KWH of electricity through conventional hydroelectric power. By 2012, that had dropped to 276,240 million KWH, a decline of 109,706 million KWH or 28.4 percent.
The fourth Georgia hospital in two years is closing its doors due to severe financial difficulties caused by Obamacare’s payment cuts for emergency services.
The Lower Oconee Community Hospital is, for now, a critical access hospital in southeastern Georgia that holds 25 beds. The hospital is suffering from serious cash-flow problems, largely due to the area’s 23 percent uninsured population, and hopes to reopen as “some kind of urgent care center,” CEO Karen O’Neal said.
Many hospitals in the 25 states that rejected the Medicaid expansion are facing similar financial problems. Liberal administration ally Think Progress has already faulted Georgia for not expanding Medicaid as Obamacare envisioned.
But the reality is more complicated. The federal government has historically made payments to hospitals to cover the cost of uninsured patients seeking free medical care in emergency rooms, as federal law mandates that hospitals must care for all patients regardless of their ability to pay.
Because the Affordable Care Act’s authors believed they’d forced all states to implement the Medicaid expansion, Obamacare vastly cut hospital payments, the Associated Press reports.
The Supreme Court ruled that states could reject the Medicaid expansion in 2012, as part of the decision that upheld Obamacare generally. Since that decision, the Obama administration has so far instituted 28 unilateral delays and changes to the health care law’s implementation without congressional approval, Fox Business reports.
From verifying eligibility for subsidies to enforcing employer requirements, the Obama administration has already taken a hacksaw to the health care reform law, but it has made no changes to the provision raising problems for half the nation’s hospitals.
While the feds wait for financial pressure to force states to act, several state governments have been taking things into their own hands. Some have criticized these moves as “hospital bailouts.”
Along with Barack Obama’s promise of “if you like your healthcare plan, you can keep your healthcare plan” came his declaration that “people with pre-existing conditions shouldn’t be penalized.”
Yeah, well, that was then and this is now. People with serious pre-existing diseases, precisely those Obama said the “Affordable Care Act” would help, could find themselves paying for expensive drug treatments with no help from the healthcare exchanges.
Those with expensive diseases such as lupus or multiple sclerosis face something called a “closed drug formulary.” Dr. Scott Gottlieb of the American Enterprise Institute explains:
“If the medicine that you need isn’t on that list, it’s not covered at all. You have to pay completely out of pocket to get that medicine, and the money you spend doesn’t count against your deductible, and it doesn’t count against your out of pocket limits, so you’re basically on your own.”
But didn’t Obama pledge – multiple times – to help those with pre-existing conditions, a: get covered, and, b: control their cost of healthcare? Here’s the reality, according to Dr. Daniel Kantor, who treats MS patients and others with neurological conditions:
“So it could be that an MS patient could be expected to pay $62,000 just for one medication. That’s a possibility under the new ObamaCare going on right now.”
Moreover, Dr. Kantor worries that “this may drive more patients” to not buy their medicines, “which we know is dangerous,” he says. “We know MS can be a bad disease when you’re not treating it. When you’re treating it, for most people they handle it pretty well, but we know when you don’t treat (it), it’s the kind of disease where people end up in wheel chairs potentially.”
And so it continues. What began with the botched rollout of a website and continued with millions of health insurance cancellation notices will no doubt stretch into a year in which the other shoe keeps dropping. We are in the midst of doing exactly what Nancy Pelosi infamously said we would before the bill became law: we are “finding out what’s in it” – and we don’t like it.
On Monday, four members of an anti-fracking group wound up in jail for using bicycle locks and glue to fasten themselves to gas pumps at a petrol station in Great Lever, England. The group chained themselves there to protest the hydraulic fracturing activities of Total, a French petroleum company.
But, to their embarrassment, they had picked the wrong petrol station, which was no longer owned by Total. The station is owned by Certas Energy, which neglected to take down the Total signs after buying it.
The petrol station’s manager, Reezwan Patel, commented that some protesters were peaceful, but that those who shackled themselves to the pumps “were stupid and have cost us a lot of money.” He added that, “We had to close for six hours, so with the loss of customers and the damage to the pumps, it could be a couple of thousand pounds we have lost.”
The four activists were not only ridiculed by Patel but also excoriated by the local environmental group, the Bolton Green Party. The party chairman, Alan Johnson, exclaimed: “I was very annoyed, and I have to stress that these people have nothing to do with our protest. We were there to protest peacefully, and warn people about the dangers of fracking, and these people have put themselves, and others, in danger with what they did.”
Throughout Bolton, a borough of Greater Manchester, anti-fracking groups have been rallying to protest hydraulic fracturing or “fracking” in the UK. Fracking is controversial because of the potential risks the process may have on the environment and the water supply.
A House Natural Resources Committee investigation has found that President Obama’s Office of Management and Budget ordered that sequestration cuts be applied retroactively to funding for rural schools over the opposition of the Agriculture Department.
The committee’s report released today, “A Less Secure Future for Rural Schools: An Investigation into the Obama Administration’s Questionable Application of the Sequester to the Secure Rural Schools Program,” detailed how last February the USDA had determined 2013 sequestration wouldn’t apply to 2012 funds that had already been distributed in the program. The White House stepped in and overruled the USDA, though neither agency has turned over numerous subpoenaed documents that could reveal more about the decision.
The Secure Rural Schools program helps provide rural counties with funds for teachers, schools, police officers, emergency services and infrastructure – “necessary because the federal government had failed to uphold its century-old promise to actively manage our national forest to provide a stable revenue stream for rural counties containing national forest land,” Chairman Doc Hastings (R-Wash.) said in reference to the timber industry link.
The program dates back to a 2000 bill, which was extended in July 2012 for that fiscal year. The $323 million in funds were doled out to 41 states by the USDA in January 2013. But two months later, after sequestration went into effect, the Obama administration announced it wanted $17.9 million back – prompting bipartisan backlash from governors and congressional representatives of the affected states.
“The Obama administration appeared intent on making this sequester as painful and visible as possible, and this was another example. Instead of working with Congress to make responsible cuts and reforms, the administration took the political opportunity to go after funds used to pay teachers and police salaries,” Hastings said at a hearing on the report today.
The chairman expressed his “frustration and disappointment in the Obama administration for repeatedly stonewalling Congress and stalling our legitimate oversight efforts” – ignoring requests for documentation and forcing the committee to issue subpoenas. Agriculture Secretary Tom Vilsack turned down a request to testify, as did U.S. Forest Service Chief Tom Tidwell, USDA General Counsel Ramona Romero, and OMB Director for Budget Brian Deese.
The only witness sent by the administration was USDA Undersecretary for Natural Resources and the Environment Robert Bonnie.
Ranking Member Peter DeFazio (D-Ore.) said it was a case of “you create a bad law, the administration applies the bad law.”
“It’s nothing really to investigate here. But we can waste a couple hours on it instead of doing something proactive to try and figure out how we are going to better provide for counties, schools and economic activity in rural areas,” DeFazio said.
Bonnie similarly testified that “the negative impacts of sequestration on Secure Rural Schools demonstrate that sequestration is a bad policy.”
He said that 19 states weren’t able to give back the funds as requested under sequestration, with half a dozen in the administrative appeals process. They could get docked for “outstanding debt” in the distribution of fiscal year 2014 funds.
“One option is to withhold dollars from Secure Rural Schools in F.Y. ’14. A second option is to withhold it through the departmental funds that may go to states. A third option is to refer to Treasury,” Bonnie said.
The report by the committee’s Republican majority summarized that “the Obama Administration complied with the law to make a SRS payment authorized in FY 2012, but then acted to retroactively apply the FY 2013 sequester to payments that had already been disbursed with the full knowledge that sequestration was set to take effect. This action demonstrates an obvious attempt of the Administration to make the sequester appear as ‘painful as possible.’”
However, the report notes, none of the responses from OMB or USDA on the incident “included internal emails or other documents that would shed light on the inner workings of the Obama Administration or how the decision to apply the sequester was made or how it was implemented.”
Over the course of the investigation the OMB has provided more than 1,300 pages of documents and the USDA more than 2,200 pages.
“Given the change in USDA’s legal analysis, pressure by the White House’s OMB, and the choice to apply the sequester of SRS funds as broadly as possible, it is clear that Congress, states, and rural communities were right to question whether these decisions were correct and made for any reason other than to make sequestration as visible and painful as possible in rural communities across the country,” the report states.
Hastings said the ultimate solution needs to be Senate passage of H.R. 1526, the Restoring Healthy Forests for Healthy Communities Act, which already passed the House with bipartisan support and is intended to end the Band-Aid approach for timber-reliant communities.
“The Secure Rural Schools program was intended to be a short-term solution and counties are still lacking a stable, dependable source of revenue,” Hastings said.
Article V Symposium: Professor Philip Prygoski
Article V Symposium: Professor Robert Natelson (Part 1)
Article V Symposium: Professor Robert Natelson (Part 2)
Article V Symposium: Bill Walker (Part 1)
Article V Symposium: Bill Walker (Part 2)
Article V Symposium: Joel Hirschhorn (Part 1)
Article V Symposium: Joel Hirschhorn (Part 2)
Article V Symposium: Judge Thomas Brennan (Part 1)
Article V Symposium: Judge Thomas Brennan (Part 2)
Article V Symposium: William Fruth (Part 1)
Article V Symposium: William Fruth (Part 2)
Debunking The Myth Of The Runaway Convention: Goldwater Institute – Lecture
Debunking The Myth Of The Runaway Convention: Goldwater Institute – Q&A
Conference On Article V Convention: Harvard Law School
Conference On Article V Convention: American Legislative Exchange Council
Click HERE to purchase Joel Hirschhorn’s book Delusional Democracy: Fixing The Republic Without Overthrowing The Government
Click HERE to purchase Barbara Perry and Paul J. Weber’s book Unfounded Fears: Myths And Realities Of A Constitutional Convention
Click HERE to purchase Mark Levin’s book The Liberty Amendments: Restoring The American Republic
Scientists have discovered huge reserves of freshwater beneath the oceans kilometres out to sea, providing new opportunities to stave off a looming global water crisis.
A new study, published December 5 in the international scientific journal Nature, reveals that an estimated half a million cubic kilometres of low-salinity water are buried beneath the seabed on continental shelves around the world.
The water, which could perhaps be used to eke out supplies to the world’s burgeoning coastal cities, has been located off Australia, China, North America and South Africa.
“The volume of this water resource is a hundred times greater than the amount we’ve extracted from the Earth’s sub-surface since 1900,” says lead author Dr Vincent Post of the National Centre for Groundwater Research and Training (NCGRT) and the School of the Environment at Flinders University.
“Knowing about these reserves is great news because this volume of water could sustain some regions for decades.”
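Taken at face value, the figures quoted above support a quick bit of arithmetic (a sketch using only the numbers in this article, not data from the study itself):

```python
# Illustrative arithmetic from the figures quoted in the article.
offshore_km3 = 500_000     # estimated low-salinity water buried beneath the seabed
ratio_to_extracted = 100   # "a hundred times greater" than sub-surface extraction since 1900

# Implied total groundwater extracted from the sub-surface since 1900
implied_extracted_km3 = offshore_km3 / ratio_to_extracted

litres_per_km3 = 1e12      # 1 km^3 = 10^12 litres

print(f"implied extraction since 1900: {implied_extracted_km3:,.0f} km^3")
print(f"offshore reserve in litres: {offshore_km3 * litres_per_km3:.1e}")
```

That is an implied 5,000 cubic kilometres extracted since 1900, against a reserve of roughly 5 × 10¹⁷ litres offshore.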
Dr Post says that groundwater scientists knew of freshwater under the seafloor, but thought it only occurred under rare and special conditions.
“Our research shows that fresh and brackish aquifers below the seabed are actually quite a common phenomenon,” he says.
These reserves formed over the past hundreds of thousands of years, during periods when the sea level was on average much lower than it is today and the coastline was further out, Dr Post explains.
“So when it rained, the water would infiltrate into the ground and fill up the water table in areas that are nowadays under the sea.
“It happened all around the world, and when the ice caps started melting some 20,000 years ago and the sea level rose, these areas were covered by the ocean.
“Many aquifers were – and are still – protected from seawater by layers of clay and sediment that sit on top of them.”
The aquifers are similar to the ones below land, which much of the world relies on for drinking water, and their salinity is low enough for them to be turned into potable water, Dr Post says.
“There are two ways to access this water – build a platform out at sea and drill into the seabed, or drill from the mainland or islands close to the aquifers.”
While offshore drilling can be very costly, Dr Post says this source of freshwater should be assessed and considered in terms of cost, sustainability and environmental impact against other water sources such as desalination, or even building large new dams on land.
“Freshwater under the seabed is much less salty than seawater,” Dr Post says. “This means it can be converted to drinking water with less energy than seawater desalination, and it would also leave us with a lot less hyper-saline water.
“Freshwater on our planet is increasingly under stress and strain so the discovery of significant new stores off the coast is very exciting. It means that more options can be considered to help reduce the impact of droughts and continental water shortages.”
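Dr Post’s point about energy can be illustrated with a back-of-envelope thermodynamic estimate (a sketch under assumed salinities of 35 g/L for seawater and 5 g/L for brackish groundwater, which are typical textbook values, not figures from the study): the ideal minimum work to recover fresh water scales with osmotic pressure, which is roughly proportional to salinity.

```python
# Back-of-envelope: thermodynamic minimum energy to desalinate, estimated
# via the van 't Hoff relation for osmotic pressure (pi = i * c * R * T).
# All salinity inputs are illustrative assumptions, not data from the study.
R = 8.314       # J/(mol K), gas constant
T = 298.0       # K, ~25 C
M_NACL = 58.44  # g/mol, treating all dissolved salt as NaCl
I = 2           # van 't Hoff factor for fully dissociated NaCl

def min_energy_kwh_per_m3(salinity_g_per_l: float) -> float:
    """Ideal minimum work (kWh) to recover 1 m^3 of fresh water,
    equal to the feed's osmotic pressure at vanishing recovery."""
    c = salinity_g_per_l / M_NACL * 1000.0  # mol/m^3
    pi_pa = I * c * R * T                   # osmotic pressure in Pa (= J/m^3)
    return pi_pa / 3.6e6                    # J -> kWh

seawater = min_energy_kwh_per_m3(35.0)  # ~0.82 kWh/m^3
brackish = min_energy_kwh_per_m3(5.0)   # ~0.12 kWh/m^3
print(f"seawater: {seawater:.2f} kWh/m^3, brackish: {brackish:.2f} kWh/m^3")
```

Real plants need several times the thermodynamic minimum, but the roughly linear scaling with salinity is why low-salinity sources like these aquifers are cheaper to treat than seawater.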
But while nations may now have new reserves of freshwater offshore, Dr Post says they will need to take care in how they manage the seabed: “For example, where low-salinity groundwater below the sea is likely to exist, we should take care to not contaminate it.
“Sometimes boreholes are drilled into the aquifers for oil and gas exploration or production, or aquifers are targeted for carbon dioxide disposal. These activities can threaten the quality of the water.”
Dr Post also warns that these water reserves are non-renewable: “We should use them carefully – once gone, they won’t be replenished until the sea level drops again, which is not likely to happen for a very long time.”
The study, “Offshore fresh groundwater reserves as a global phenomenon” by Vincent E.A. Post, Jacobus Groen, Henk Kooi, Mark Person, Shemin Ge and W. Mike Edmunds, is published in the latest issue of Nature.
On November 6th, Austin, Texas-based Solid Concepts announced the successful printing and firing of the world’s first 3D-printed metal 1911 handgun.
The gun looks like any mil-spec 1911 you could find in a local gun store.
Every aspect of the gun is printed using Solid Concepts’ Direct Metal Laser Sintering (DMLS) process – except for the springs. During tests the gun repeatedly fired and cycled using standard Winchester .45 cal ammunition, and accuracy was impressive.
After the successful tests, Solid Concepts’ Phillip Conner said, “We were not looking for a cheaper, easier, better way to make a gun–that wasn’t the point at all.” Rather, he said they were trying to “dispel the commonly held notion” that their DMLS parts and guns “are not strong enough or accurate enough for real world applications.”
Solid Concepts holds a federal firearms license (FFL).