by Malcolm Barr
Aachen, Germany / Citrus Heights, CA, September 30, 2011 - KISTERS presented the final results of Surface Water Interoperability Experiments #1 and #3 to the Open Geospatial Consortium (OGC) in Boulder, CO this past week. The end result of this work is KiWIS (KISTERS Web Interoperability Solution), the first commercially available, robust web service module of its kind on the market. Used in combination with WISKI or Hydstra, KiWIS can both consume and publish real-time hydrological data over the Internet (http://kiwis.kisters.de) using open standards such as WaterML 2.0 and SOS. This combination can also be placed on top of existing time series data archives, providing a quick and efficient method for disseminating data to, and retrieving it from, the Internet.
KISTERS worked closely with leading organizations such as the Global Runoff Data Center (GRDC), the Commonwealth Scientific and Industrial Research Organisation (CSIRO), the US Geological Survey (USGS), 52 North, SANDRE, German Waterways DLZIT, the Australian Bureau of Meteorology (BOM) and the Geological Survey of Canada (GSC) to advance the development of WaterML 2.0 and to test its use with the OGC Sensor Observation Service (SOS) standard.
A WISKI system was implemented for the Global Runoff Data Center at the KISTERS head office in Aachen, enabling the GRDC to view all of its metadata, specific surface water attributes, time series data, and derived data products (such as daily mean, mean lowest flow, monthly flow and monthly highest flow) within the WISKI system.
KISTERS then set the KiWIS service on top of the GRDC system. KiWIS publishes WISKI content via the SOS API as WaterML 2.0, and can publish the same content as WaterOneFlow/WaterML 1.0. It also supports formats such as ESRI layer packages. With KiWIS on top of the GRDC system, KISTERS can now bring GRDC data into the cloud for everyone to use.
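To give a feel for what this looks like from a client's point of view, here is a minimal sketch of querying a standards-based SOS endpoint for WaterML 2.0 data. It assumes a generic OGC SOS 2.0 KVP binding; the service path and the offering and observed-property identifiers are hypothetical placeholders, and a real client would take them from the service's GetCapabilities response.

```python
# Minimal sketch of a generic OGC SOS 2.0 KVP client; not KiWIS-specific.
# The service path and identifiers below are hypothetical placeholders.
import requests

BASE_URL = "http://kiwis.kisters.de/sos"  # hypothetical service path

# 1. Discover what the service offers (standard SOS request).
caps = requests.get(BASE_URL, params={
    "service": "SOS",
    "request": "GetCapabilities",
})
print(caps.text[:300])  # XML capabilities document

# 2. Request one month of observations, encoded as WaterML 2.0.
obs = requests.get(BASE_URL, params={
    "service": "SOS",
    "version": "2.0.0",
    "request": "GetObservation",
    "offering": "DischargeDaily",          # hypothetical offering id
    "observedProperty": "Discharge",       # hypothetical property id
    "responseFormat": "http://www.opengis.net/waterml/2.0",
    "temporalFilter": "om:phenomenonTime,2011-08-01/2011-08-31",
})
print(obs.text[:300])  # WaterML 2.0 time series XML
```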
Michael Natschke, WISKI Product Manager at KISTERS, says, “We are extremely excited about the outcome of the work done in these IEs. If it were not for the immense efforts put forward by all the initiators and participants in the surface water and groundwater IEs, KISTERS would not have been able to develop KiWIS. Interoperability issues around data and data formats between and within the surface water, groundwater and meteorological domains are now being solved with KiWIS. What’s even better is that this data can be seen by any client application able to use these open standards.” Stefan Fuest, Product Manager, Web/GIS at KISTERS, notes, “KiWIS makes use of KISTERS’ proprietary high-speed data protocol for moving massive amounts of time series data to the Internet, unrivaled by any other commercial water data management software in the marketplace. I want to thank everyone involved in keeping KISTERS at the forefront of implementing open standards for water data management.”
Peter Fitch, research program leader for Environmental Information Systems at CSIRO, coordinator of the interoperability experiment and a key member of the OGC Hydro Domain Working Group, which is developing WaterML 2.0, comments, “I share Michael’s enthusiasm for the development of open standards like WaterML 2.0. Having vendors like KISTERS participate in the development of open standards and produce tools such as KiWIS is critical to the success of the OGC Hydro Domain Working Group and its goals of improved access to, and interoperability of, hydrological data.”
Ulrich Looser, Head of the GRDC, emphasizes that “the advantages of web services are clearly recognized by the GRDC and a process has started to develop web services for a number of GRDC requirements. We certainly look forward to continuing our close work with KISTERS, not only within the OGC Hydro Domain Working Group, to build upon the many successes we have seen thus far.”
KISTERS develops industry-proven software solutions for the management of water, air and energy resources. Its high-capacity, scalable systems can be easily tailored to suit both customer and local demands. KISTERS software is implemented around the world and conforms to all major industry standards. Customers include private companies - from small firms up to well-known multinationals - as well as all levels of government.
by Malcolm Barr
Over the weekend, most of the North Island was lashed by heavy rainfall, resulting in many flooded homes, businesses and properties. Hamilton wasn't as badly affected as other areas, but my basement was 15-20 cm deep in water by midday Sunday, and some of the belongings stored there were damaged.
Obviously I couldn't do anything to stop the rain (and after sandbagging and setting up a pump I couldn't do much more for the basement either), but I did find it interesting to be able to see online how much rain had fallen and what the river levels were in the region. Using the Environment Waikato website, I was able to get up-to-date information (updated within 15 minutes), customise the graphs to show any time range, and aggregate the data to show raw readings, incremental values, or hourly/daily totals. I could also download the readings as a CSV file.
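That CSV export makes it easy to do your own aggregation too. As a rough sketch (the file name and column names below are hypothetical; match them to whatever the council site actually exports), hourly and daily totals are only a few lines in Python:

```python
# Sketch: turning downloaded rainfall readings into hourly and daily totals.
# The file name and column names are hypothetical placeholders.
import pandas as pd

df = pd.read_csv(
    "rainfall.csv",
    parse_dates=["Timestamp"],  # hypothetical timestamp column
    index_col="Timestamp",
)

# Readings are incremental rainfall amounts (mm), so totals are sums.
hourly = df["Rainfall_mm"].resample("1H").sum()
daily = df["Rainfall_mm"].resample("1D").sum()

print(daily.tail())  # the most recent daily totals
```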
Most members of the public are unaware that this service exists, but most regional councils in New Zealand will give you free access to up-to-date environmental monitoring information. Councils running our HydroTel™ telemetry system that also have the HydroTel™ web component can offer the public data that is as up to date as any data they have themselves. Some of them publish more than just rainfall and river levels, such as air quality and groundwater levels.
Here are some links to councils offering this service:
Auckland Council (add environmental monitoring widget under Tools)
Hawke's Bay Regional Council
Bay of Plenty Regional Council
Marlborough District Council
- Malcolm Barr
by Malcolm Barr
Our dataloggers are designed to work in some pretty extreme conditions and to keep working for a long time unattended. We put a lot of effort into ensuring this is the case and are proud of our results.
Occasionally the limits of our gear have really been put to the test. In April, a major fire on the historic wharf at Raglan destroyed an iRIS 320 and all of the monitoring equipment with it. Then earlier this month, another iRIS 320 came close to suffering the same fate on the Waikanae River. That iRIS, housed in the enclosure at the top of the green pole, was unharmed by the fire.
The harsh treatment of dataloggers is not new. One logger took a direct lightning strike via the radio aerial; the pressure inside bent the lid against the central screw holding it on. Unfortunately that logger didn't survive and is now a museum piece.
The good news is that these are definitely the extremes!
- Malcolm Barr
by Michael Cook
January 1, 2011... The world awoke - or rather didn't wake - to the first widely publicised technical glitch of the new year.
In a Y2K-like scenario, the ubiquitous Apple iPhone and iPod revealed a software bug that affected their alarm clock function. It was probably only a tiny error in a huge number of lines of code, but the effect was much bigger: thousands of angry tweets, blog entries and emails from users who had overslept and were late for work - or worse, had missed flights and trains. I also have an iPhone 4 and use the alarm regularly, but was fortunately unaffected as it was a holiday!
This raises the topic of our reliance on technology in the modern world and the explosive rate at which it is changing. Who and/or what is actually driving the change? Is it "market-push" or "market-pull"? Is it manufacturers or consumers - or both? Is it just about dollars or perhaps improving society or simply just an ongoing pursuit of technology for technology's sake?
Here at iQuest we are both developers and consumers of technology. We are a link in a chain, using other vendors' products to develop our own. Hardware components, software development tools, communication networks - we are reliant on a wide range of technology. Downstream, our users then rely on our integration and application of these technologies, often in mission-critical environmental monitoring systems.
Sometimes, though, to some it feels like it is all going far too quickly. I came across a very poignant blog post a while ago that brings an interesting perspective to this topic. Titled "A Call for Revolution against Beta Culture", it is food for thought as we jump head-first into the new year.
We look forward to what's ahead. One thing is for sure - it won't be boring!
- Michael Cook
by Michael Cook
Installations using iQuest products are found in some pretty extreme locations. They range from the searing heat of the Australian Outback to the freezing cold, buried in snow high on New Zealand mountains. Dataloggers are monitoring a range of parameters out in the remote environment with data transferred by telemetry back to 'civilisation'. However, a recent news item in the Los Angeles Times about the NASA Mars Exploration Rovers project puts the term 'extreme' into a whole new perspective! It makes for interesting reading for those with a passion for innovation and projects truly "outside the box"...
After six highly successful years of exploring the red sands of Mars, NASA's rover Spirit will rove no more. With its six wheels stuck in powdery sand and two wheels no longer working at all, the resilient little explorer will become an immobile scientific observatory - if it can survive the harsh temperatures of the upcoming winter. "Its driving days are likely over," said Doug McCuistion, director of NASA's Mars Exploration Program.
If Spirit can be awakened after what could be a six-month hibernation, researchers will use it to attempt to answer one of their most pressing questions: whether the red planet has a solid iron core or a liquid one. If the vehicle can't be revived, it will still have far surpassed scientists' original expectations and its design life of three months, traveling nearly 12 miles across the barren surface of Mars and finding strong evidence that water once altered the planet's terrain. Spirit's twin, Opportunity, is still moving across the Martian surface farther north nearer the equator and on the other side of the planet, and continues to send back valuable data. Opportunity has successfully weathered every Martian winter so far because "it is in a different thermal environment," McCuistion said, and the team that controls it doesn't expect any troubles for it this winter.
Spirit's problems began nine months ago when its wheels broke through the thin Martian crust and sank into powdery sand. Breaking free proved difficult because one of the rover's six wheels had broken down three years earlier. A second wheel became immobilized during the extrication attempts, leaving the vehicle with three good wheels on its left side and only one on its right. So far, the efforts to free it have only dug the wheels in deeper. About a week and a half ago, with winter approaching, the team shifted its emphasis from extricating the rover to positioning it so that its solar panels would receive more sunlight, rover driver Ashley Stroupe of the Jet Propulsion Laboratory said at the news conference. The rovers were designed and built at the La Cañada Flintridge laboratory, and engineers have been guiding them from that location. The most likely scenario is that Spirit's power supply will get lower and lower and eventually it will shut down and go into hibernation mode until spring brings more sunshine.
The question is whether engineers will be able to revive it to use that sunshine. NASA engineers expect temperatures around Spirit to fall into the minus 40s this winter. The craft was designed to operate in temperatures as low as minus 40 degrees and to survive temperatures as low as minus 67 degrees. "But that is with a brand-new rover fresh out of the box," said John Callas, JPL's project manager for the rovers. If Spirit does survive, researchers hope to get many more scientific results from it, said Steve Squyres of Cornell University, the project's principal investigator. By tracking Spirit's radio signal precisely over a long time (perhaps six months or more), the team will be able to monitor Mars' "wobble" in its orbit. That will allow scientists to determine whether the planet has a solid or liquid core. "This is totally new science, really fundamental stuff" that can be achieved only with a stationary platform, he said. By looking around the craft for a long period, he added, the team will also be able to monitor how the planet's atmosphere interacts with its surface. And finally, by continuing to dig at the current site, the team will be able to characterize the soil much more thoroughly than has been achieved anywhere else on Mars. "The bottom line is, we are not giving up on Spirit," Squyres said.
by Malcolm Barr
Ordinary people carrying small sensors and cell phones are set to become an integral part of environmental monitoring in San Diego as part of a project called CitiSense, the aim of a team of computer scientists at the University of California, San Diego.
From the University's news center - "The goal of CitiSense is to build and deploy a wireless network in which hundreds or thousands of small environmental sensors carried by the public rely on cell phones to shuttle information to central computers where it will be analyzed, anonymized and reflected back out to individuals, public health agencies and San Diego at large. At the same time, the sensor-wearing public will have the option to also wear biological monitors that collect basic health information, such as heart rate. This combination of sensors will enable the team’s medical team to run exacting health science research projects, such as investigating how particular environmental pollutants affect human health."
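The data flow described above (sensor to phone to central server, with readings anonymized before being shared back out) follows a familiar relay pattern. Purely as an illustration of that pattern, and not of the actual CitiSense implementation, a phone-side relay might look something like this; the endpoint URL and field names are hypothetical:

```python
# Illustrative sketch of a sensor-to-server relay with an anonymized id.
# This is NOT the CitiSense implementation; the URL and field names are
# hypothetical, and a salted hash is only a stand-in for real anonymization.
import hashlib
import json
import urllib.request

def anonymized_id(device_serial: str) -> str:
    """Derive a stable pseudonym so readings can be grouped over time
    without exposing the device's real identity."""
    return hashlib.sha256(b"salt:" + device_serial.encode()).hexdigest()[:16]

def relay_reading(device_serial: str, pollutant: str, value: float) -> None:
    """Forward one sensor reading to the central server as JSON."""
    payload = json.dumps({
        "sensor": anonymized_id(device_serial),
        "pollutant": pollutant,  # e.g. "ozone_ppb"
        "value": value,
    }).encode()
    req = urllib.request.Request(
        "https://example.org/readings",  # hypothetical endpoint
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)

relay_reading("SN-000123", "ozone_ppb", 41.5)
```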
The team has won a US$1.5 million grant from the National Science Foundation to "solve the many technical challenges that stand in the way of applications that merge the cyber and physical worlds."
It will be interesting to follow this project as it proceeds, as it moves the telemetry of environmental data from the control of the collecting agency into the hands of the general public. There will also be a mountain of challenges to overcome in terms of data accuracy; however, they have some pretty smart brains working on it.
What do you think of their plan? You can leave a comment below.
- Malcolm Barr
by Malcolm Barr
"World gets warmer as climate talks start" was the title of a New Zealand Herald article yesterday. It asserts that "this year is likely to be among the 10 hottest years on record", and that "the decade of the 2000s (2000-2009) was warmer than the 1990s, which were warmer than the 1980s". A similar article screened on TVNZ the previous day. Various media outlets around the world also had similar articles.
There are two interesting things at play here. The first is the length of the record: according to the article, instrumental climate recording began in 1850, so we are talking about only 160 years of observations. The second is the timing: this data has been released during the United Nations Conference on Climate Change.
It might just be me, but it seems no coincidence that an alarmist media release has occurred just before world leaders debate a new protocol to replace the Kyoto Protocol in a few years. If they need a reason to be able to introduce radical measures, this data will be a fantastic help.
Whichever side of the global warming debate you sit on (very few people are on the fence!), you can't deny that average global temperature is increasing and has been doing so for at least the last 50 years; there is ample evidence to show this. But can we use data from only 160 years to make decisions that will affect the way each of us lives?
I’m certainly not saying that the conference in Copenhagen shouldn’t be happening, or that we shouldn’t all be doing our bit for the environment, just that we should treat every piece of information we receive – from both sides of the debate – with a healthy dose of skepticism and apply a little of our own intelligence and critical thinking before swallowing it up.
- Malcolm Barr
by Michael Cook
The successful launch of the Rocket Lab "Atea-1" rocket on November 30th was a great testimony to Kiwi innovation and perseverance. Right up to the last minute of the delayed take-off, the atmosphere of excitement and suspense was maintained for media and supporters as a few technical hitches were sorted out.
Of special interest to us is the on-board avionics (control, data logging and satellite telemetry) system. We know only too well that designing and building gear that can endure severe acceleration, temperature and pressure changes pushes technology to the limit. There is no room for failure once the launch sequence is initiated! The other major factor in rocket design is an absolute focus on weight reduction: every unnecessary gram limits the maximum altitude achievable. This makes, for example, the choice of battery paramount in maximising life against weight.
Working on a shoestring budget over several years, the small team pulled off what is an incredibly difficult feat for anyone without access to the resources of big aerospace business or government. Over the years, Rocket Lab has developed cutting-edge technology that has generated interest from around the world; specialised ablative coatings and improved rocket fuels have been spin-offs. However, their technology road map has a common goal: lowering cost and providing environmentally friendly solutions to give science and industry access to sub-orbital space.
Although the Rocket Lab team appear to be the ultimate "geeks", passionate about rocket science and things that go bang, they are commercially astute and very keen to promote NZ as a centre of innovation and excellence. They have proved that with enough passion, enthusiasm and the right connections, it is possible to achieve a breakthrough and get the attention of the "big guys".
As another small NZ company working with rapidly changing technology in a global environment, we salute Rocket Lab and look forward to following the progress of what is already a unique and interesting company.
- Michael Cook
by Malcolm Barr
Regional councils are becoming ever stricter when it comes to compliance, recording and reporting for water take consent holders. It is common for consents to require daily electronic submission of data, which means a quality datalogger and a quality telemetry system are required. For the average user, setting all this up can be a major headache.
Leaving aside the flow meter equipment (we don't supply or install it, so it isn't up to us to comment), how do you know whether the datalogger and telemetry solution you have chosen will meet your requirements long term? Firstly, I would separate the question into two parts – the datalogger and the telemetry requirements.
Make sure the datalogger is a proven model. Cheaper is definitely not better in this case! You want a datalogger that is flexible enough to handle multiple inputs and has the capacity to store large amounts of data in the unlikely event that your telemetry system fails for a period of time (perhaps as a result of a mobile phone provider problem). If you aren't sure, contact someone in the environmental monitoring team at your regional council; they will have experience with a range of dataloggers.
For the telemetry requirements, first check that your provider will be able to get the data from the datalogger to the council in the format they require, and as frequently as they require it. They may also be able to offer you a service where you can view your data on the Internet. The data will probably be transmitted from the logger over one of the mobile networks, so data charges are going to be important (this will be one of the few ongoing costs). The total amount of data is relatively small - see the rough estimate sketched below - so check that you aren't being charged for a standard Internet data package (500MB or more), as this is far more than you need.
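As a back-of-envelope illustration of why a 500MB package is overkill (every figure here is an assumption for the sake of the arithmetic, not a measured value from any particular logger or network):

```python
# Rough estimate of monthly telemetry data volume for a consent logger.
# Every figure below is an illustrative assumption, not a measured value.
readings_per_day = 96     # assume one reading every 15 minutes
bytes_per_reading = 50    # assume timestamp + value + framing
overhead_factor = 2.0     # assume headers, acknowledgements, retries

daily_bytes = readings_per_day * bytes_per_reading * overhead_factor
monthly_mb = daily_bytes * 30 / 1_000_000
print(f"~{monthly_mb:.2f} MB per month")  # ~0.29 MB: a tiny fraction of 500MB
```

Even with generous allowances for protocol overhead, the monthly total comes out well under a megabyte, so a plan sized for general Internet browsing is the wrong fit.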
Lastly, you should also check whether your telemetry provider will notice if your datalogger goes offline or develops problems. If data is being submitted daily, they should check daily that it has been submitted; your regional council won't be happy if it takes a week or two of missing data before anyone does anything about it.
- Malcolm Barr