Wednesday, March 11, 2015

Changing an Industry. First by accident and then by design.


"There are two kinds of companies, those that work to try to charge more and those that work to charge less. We will be the second." - Jeff Bezos, CEO of Amazon

When SampleServe.com first started back in 2001, the goals we had for the company were completely different from the goals we have now. Back then the internet was just gaining momentum, and not everyone was comfortable having their data online. As a matter of fact, some people were outright opposed to it.

In the beginning we wanted to be a "field services" company. In 2002 we developed an online data management tool to facilitate our own sampling. Being efficiency minded, we wanted to be able to lay out a project and print chains of custody, sample bottle labels, and sample location maps ahead of time so that our own field staff could sample more quickly and with fewer mistakes. The client access function, where the client could log in and generate their own reports if they wanted to, was a byproduct of our own logistical needs. It was actually one of our programmers at the time who suggested having the client log in and print out some of their own tables. I said, "Great idea! The data is already there, why not?" So we added it.


Unfortunately, not much happened with SampleServe.com for a while, largely because I was busy doing other things. In 2008, when I decided to give it my full attention again, things started taking off. The software was revamped and the selling went full force. The goal was still to be a "field services" company. Again, the software was meant to be a "value add" for the client and something that separated us from other companies.

It was in early 2011, while meeting with a large nationwide engineering company to discuss their sampling needs, that they said, "Well, we really want to keep doing our own sampling, but we really like your software. Can we just use your software?" I had actually heard that question a few times previously, but this time it struck home. It occurred to me that we should be a software company as well. It took about a year to revamp the software to give clients the ability to set up their own projects (i.e. print their own chains of custody and sample bottle labels, etc.), but when that was complete, we were officially a software company.


We've been moving along and continually improving our software since then but now it's time for the next iteration of SampleServe.com.  Just for grins, I'm calling it SampleServe 2.0.

SampleServe 2.0 is still in field services - that's been our bread and butter - but now we're emphasizing software and developing our own tablet/smartphone application. People have asked why we don't get out of the field services business if we want to be a software company. My answer is that laboratory services and the testing and analysis function will soon be moving into the field, and we want to be in a position to take advantage of that market as the technology develops.

Mobile smart devices and sensor technologies - like this one, which can detect lung cancer using a breathalyzer-like device - and other technologies like this one are being developed all the time. I imagine a time in the not too distant future where the results from sampling of any kind are displayed almost instantly on a mobile device (think Star Trek tricorder). So the question now is: is it better to be a laboratory trying to move into the field services arena, or a field services company trying to provide lab services?


Here is how SampleServe 2.0 will be the same:
  • We will still do field sampling for clients that want the service (using "tricorders" as they become available).
  • We will still allow clients that do their own sampling to access our project management and reporting functions.
Here is how SampleServe 2.0 will be different:
  • SampleServe 2.0 will be able to be accessed directly through your current participating laboratory's website. I have one poll question for those that are interested here: Lab Data Reporting Poll
  • We will have improved project set up and project management functions. Many functions that are repetitive will be automated.
  • Project management and reporting for other media will be available (e.g. soil, air, surface water, sediment, soil vapor, asbestos, mold and radon). 
  • Sampling instructions and access to the system will be available through a tablet/smartphone application.  Data will be able to be collected and uploaded directly from the application.  Photos, GPS coordinates, and sampling details will be able to be entered through the application's features.
  • Sample bottle labels will be able to be printed in the field, with QR Codes and text that identify all sampling details, using a small, simple, and inexpensive 12-volt label printer (a sketch of what a QR-coded label payload might contain follows this list).
  • Electronic "Chains of Custody" will be automatically generated and e-mailed to the lab ahead of the samples for efficient automated sample login by laboratories using the sample bottle label QR Codes.  
I'd like to say exactly when SampleServe 2.0 will be completed; however, at the moment the timeline is inversely proportional to the budget. If you're interested in partnering on this industry-changing project, I can be contacted directly at 231-218-7955 or via e-mail.  Ideal partners would be software programmers, laboratories, or other venture stakeholders.

Your comments and questions about anything I've written are appreciated.

- Russell Schindler

Tuesday, January 13, 2015

Laboratory Data Deliverables

" 
Civilization advances by extending the number of important operations which we can perform without thinking of them." 
-Alfred North Whitehead

I may be dating myself, but when I started working in the environmental industry, the fax machine was just becoming standard office equipment. Fax machines were slow, used thermal paper that came on a roll, and you had to cut the paper to the right size yourself. The paper was difficult to work with, as it wanted to roll back up on itself. When we needed results from a laboratory in a hurry, we would ask them to "fax it over." Otherwise, laboratory reports were always printed on paper and mailed via the US Postal Service. Once the paper or fax reports were received, we would take the data and "maybe" put it into a table. We would have to re-type the data ourselves into whatever format we wanted it in. We didn't always do tables, because I'm talking about a time before Microsoft Excel. Tables I worked on back then were mostly rudimentary tables constructed in a program called WordPerfect. We would, however, always discuss the data, trends, exceedances, etc. in the text of the report. Full lab reports were always included as an appendix, and reports were always printed out and delivered on paper.


I spent hours and hours re-typing data from paper into a computer, only to print it back out on paper again. It was a good thing both my employer and I were getting paid by the hour.

It was around the time that e-mail became mainstream, in the late 1990s to early 2000s, that the laboratories started giving you the option of having your data delivered via fax machine or e-mail, but the deliverable itself was still essentially just paper. It wasn't until the early to mid-2000s that most labs started giving you the option of having your data delivered as an Excel file.

So ended the days of getting paid to re-type data. Think of the hundreds of millions of dollars saved collectively across the country over the years by end-user environmental clients, simply because the file could be delivered electronically and no longer had to be re-typed. It seems like a small thing now that we are all used to the technology, but it was a huge savings overall. Laboratories that embraced e-mail delivery early on gave themselves a competitive advantage over late adopters. And although not having to re-type data cut into the billable hours of the engineering consulting firms previously doing that re-typing, it ultimately made them more price competitive in the eyes of their end-user clients.

The disheartening thing is that it's been nearly 14 years since the first delivery of data via e-mail in an Excel file, and that's where the technology largely stands today. Some laboratories will send you an Excel file customized to your exact specifications, but it's still just a fraction of what the end user, the entity ultimately paying the bills, actually needs.

End users (the regulated oil companies, manufacturing firms, developers, etc.) need full reports - full-color tables, graphs, and maps - that go well beyond simple Excel tables. The future I see is a scenario where the laboratory provides the mechanism for the client to get all of those required full-color reports as electronic deliverables. It's the next logical technological progression in data deliverables. I've had conversations with numerous laboratories about taking that next step, as SampleServe's software has been developed for just this purpose. The resistance to date by many of the laboratories is somewhat baffling to me. The response I hear most often is, "The consulting engineering firms will perceive that we are competing with them and might not use us anymore." At first glance I can understand this initial reaction, but it doesn't stand up to the history of technological advancements. If a technology delivers a comparable product, saves time, and saves money, it will ultimately win in the marketplace. The situation is similar to the days when Excel files were first delivered. I personally remember deciding not to use a particular lab because they couldn't deliver data as an Excel file. Most labs wouldn't even think of not delivering data via Excel these days; no one would use them. There will be a day in the not too distant future when delivering data means delivering it in as many formats and fashions as the client can think of. Not doing so will mean not getting the business.


If you would like to talk to SampleServe.com about the potential of your data deliverables, please contact me.

As always, your comments and questions about anything I've written are appreciated.

- Russell Schindler

Technological Unemployment


"Technology... is a peculiar thing. It brings you great gifts with one hand, and it stabs you in the back with the other." 
-C.P. Snow

Wikipedia defines Technological Unemployment as "unemployment primarily caused by technological change," or, more accurately, technological innovation. In industries where "technological unemployment" occurs, productivity and profitability tend to increase. We don't generally think of increased unemployment as causing increased productivity.

Most "technological unemployment" tends to happen in industries and work scenarios that are easily automated like assembly and manufacturing. Think robots. 
 However, automation is occurring in more and more sectors not previously thought to be automatable. Fast food restaurants are starting to experiment with automation. I've personally seen robotic beverage machines. A machine that grabs the right size cup and fills it with the cold drink I've ordered. I've also seen a computer touch screen for placing your order, which replaces the fast food worker who used to take your order. Some restaurants are experimenting with replacing wait staff by placing iPad like devices on restaurant tables and you just order your food from there with a simple pull down menu when you're ready. No wait staff coming out to take your order. Need another beer? Just punch it in and someone will bring it out.


If you think robots caused "technological unemployment," wait until 3D printing (learn more here) becomes mainstream. Several 3D printing patents that had been holding the 3D printer market back expired last year. Many other patents still exist; however, with the innovation occurring in the 3D printer market, and with the software and material science behind it ever advancing, 3D printing is going to cause "technological unemployment" like we've never seen before. It will not just affect the manufacturing sectors, but also shipping, warehousing, and even construction and housing. Need a bridge? Print it. Need a house? Print it. 3D printers are cheaper than Chinese labor, and there is no need to warehouse or ship parts and various other items across oceans. They will be able to be printed on-site, or right next door in every hometown.

The explosion of software and mobile applications is also causing "technological unemployment." Ride-sharing applications like Uber and Lyft have caused decreased revenues among taxicab companies. Room rental software like Airbnb, where you can rent a room at bed and breakfasts and private homes all over the world, is hitting large hotel chains with decreased revenues. The smart capitalist understands that productivity and profitability don't have to be tied to the number of employees a company has. As it has been since the beginning of tool making, technology that lets you do what you have been doing at a lower overall cost is going to win in the marketplace.

SampleServe.com's own software is loathed by AutoCAD technicians all over the country. It all but eliminates the need for AutoCAD technician time, significantly impacting their pocketbooks. Project management time is about 10-20% of what it used to be. The typical monitoring project is mostly data crunching and mapping, and with our software, what used to take two days now takes about 10-20 seconds. The vast majority of the environmental industry bills by the hour, so there is an inherent, subliminal resistance to technologies that cut into billable hours. It's an "it's working now, why would I want to change?" mentality. I can't tell you how many times I've heard "How am I supposed to make any money if I use your software?" from a potential client.


Resistance to change is inherent in humans; we are all creatures of habit. Many of us, however, are also curious, inventive, and competitive. In today's technological world, and with the ever-increasing rate at which technology is advancing, if you don't want to become one of the "technologically unemployed," you had better make learning and embracing new, proven technology a daily habit.

As always, your comments and questions about anything I've written are appreciated.

- Russell Schindler

Thursday, March 20, 2014

Electronic Chain of Custody and Matching Bar Code Sample Bottle Labels. Arriving Summer 2014!


"Innovation comes from people meeting up in the hallways 
or calling each other at 10:30 at night with a new idea, or 
because they realized something that shoots holes in how 
we've  been thinking about a problem" - Steve Jobs

In our endless - and I do mean endless - pursuit of improving our application, we have begun work on the "next big thing." We call it our electronic chain of custody (ECOC). I mentioned this concept about a year ago and now we have begun programming work on this new feature.

Here's how it works: The project manager defines the project sampling scope, just as they have done in the past. Once the scope is agreed upon by the pertinent parties, the scope will be finalized, and at that point sample bottle labels with unique identifying bar codes can be printed along with an associated preliminary chain of custody. Once sample bottles are filled and sample bottle labels with the bar code are applied, the ECOC can be completed online using a laptop, tablet, or cell phone. At this point, any changes to the sample IDs can be made along with the sample dates, times, special notes etc. Once the ECOC is complete, the data file is automatically e-mailed to the respective laboratory. The lab can then sync this data file with its sample receiving system without having to manually enter sample data from a paper chain of custody. Once the sample bottle arrives at the laboratory the next day, the sample bottles are scanned in using a bar code reader, and the relevant data from that sample is finalized as received within the lab's data reporting system.
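
For readers who like to see the moving parts, here is a minimal sketch of what an ECOC data file and the lab-side barcode match could look like. The JSON format, field names, and barcode values are hypothetical illustrations of the workflow described above, not SampleServe's actual file format.

```python
# Minimal sketch of an electronic chain of custody (ECOC) record and the
# lab-side barcode lookup. All field names and values are illustrative only.
import json

ecoc = {
    "coc_id": "COC-2014-0042",
    "lab": "Example Analytical Labs",   # hypothetical receiving laboratory
    "samples": [
        {"barcode": "SS-000123", "sample_id": "MW-1", "matrix": "groundwater",
         "collected": "2014-06-12T09:15", "analyses": ["BTEX", "PNA"]},
        {"barcode": "SS-000124", "sample_id": "MW-2", "matrix": "groundwater",
         "collected": "2014-06-12T10:05", "analyses": ["BTEX"]},
    ],
}

# Field crew finalizes the ECOC on a laptop, tablet, or phone, and the data
# file is e-mailed to the lab ahead of the bottles.
with open("COC-2014-0042.json", "w") as f:
    json.dump(ecoc, f, indent=2)

# Lab receiving: scan a bottle's barcode and pull up the matching sample record,
# so nothing has to be re-keyed from a paper chain of custody.
def find_sample(ecoc_record, scanned_barcode):
    for sample in ecoc_record["samples"]:
        if sample["barcode"] == scanned_barcode:
            return sample
    return None

print(find_sample(ecoc, "SS-000124"))
```
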

The benefits of the system for laboratories are fourfold:
  1) It saves labs time and money on data entry, as all data is entered by field personnel while in the field.
  2) It minimizes the opportunity for data entry errors; sampling data is only entered one time.
  3) It lets the lab know exactly which samples are arriving the next day, allowing for staffing and equipment planning.
  4) The bar coding ECOC feature is tied to our current sample project management application and allows a smooth transition from project planning, to sampling, to lab receiving and reporting, and then to completing the final reports with all the tables, charts, graphs, maps, and contouring needed.
The entire application, from planning to reporting, can be uniquely customized and licensed for use directly by the laboratories themselves and re-offered to their clients as a customer service enhancement and source of additional lab revenue if desired. This allows the lab to take data reporting to the next level and beyond. Don't just give them the data - allow them to complete the entire report.

If you represent a laboratory and are interested in talking about this new application, please contact Russell Schindler at 231-218-7955 or by e-mail at schindler@sampleserve.com.

- Russell Schindler

Groundwater Elevation and Contaminant Levels: Correlation or Causation? How can you tell?


"The obvious is that which is never seen until someone 
expresses it simply." -Kahlil Gibran

Over the years I've heard a lot of explanations regarding groundwater elevations and subsequent increases and/or decreases in contaminant levels and the appearance and disappearance of free product. I've heard people argue that rising water levels increase contamination, and I've also heard the opposite. Some people believe that rising water levels cause free product to disappear and that falling levels cause it to reappear (which is counterintuitive to a layperson). The question, or debate, is (all other things being equal): does groundwater elevation have an effect on contaminant levels? Furthermore, does groundwater elevation have an effect on the presence or absence of free product? These are questions of correlation versus causation.

I've been doing this kind of work for about 26 years now, and I've learned enough to know that it all depends on the specific site and even the specific well.

Webster's Dictionary defines correlation as: "a relation existing between phenomena or things or between mathematical or statistical variables which tend to vary, be associated, or occur together in a way not expected on the basis of chance alone." Webster's defines causation as: "the relationship between an event or situation and a possible reason or cause."

Determining correlation, in most cases, is pretty simple. We tend to use graphs to visually plot and identify correlations. In the example to the right, you can see a strong correlation between the two variables; however, the graph implies causation, which is ridiculous - organic food doesn't cause autism. It was Mark Twain who said, "There are lies, damned lies, and statistics." Determining causation requires a bit more thought and reason, and can be tricky to plot on a graph.
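
For those who want to go a step beyond eyeballing a graph, here is a minimal sketch of how the strength of a correlation can be quantified with a Pearson correlation coefficient. The elevation and concentration values are made up for illustration; they are not data from the site discussed below.

```python
# Minimal sketch: quantify the correlation that a graph shows visually.
# The elevation and concentration values below are hypothetical.
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

gw_elevation = [601.2, 602.8, 604.1, 603.0, 601.5]   # ft above datum (hypothetical)
benzene_ugl  = [120, 310, 540, 350, 150]             # ug/L at a source-area well (hypothetical)

print(pearson_r(gw_elevation, benzene_ugl))  # roughly +0.99: strong positive correlation
# A strong correlation still says nothing about *why* - that is the causation question.
```
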

With regard to groundwater elevation and contamination, in Graph A you can see that groundwater contamination seems to go up and down in concert with the groundwater elevation. This is a strong correlation which implies causation. However, in Graph B, the correlation appears to be the exact opposite: contaminant levels seem to go down with increasing groundwater elevations. The two causation conclusions implied by the graphs seem to conflict with each other. One of the interesting things about the data from these two wells is that they are taken from the same site across the same date range.

The explanation for the differing correlations is that one well is located in a source area and the higher groundwater elevations expose the groundwater to remaining residual source material in the soils and capillary fringe. This exposure thus causes increased contaminant leaching and migration to the groundwater. The down gradient well is subject to the dilution effects of the increase in clean water entering the aquifer at that well location.

The point of this article is not to argue the significance of these effects, or whether there is some other explanation for the cause, but instead to illustrate that simply visualizing the data correlations is important in evaluating and eventually determining causation. In this example, seeing concentrations at the source-area well increase while concentrations at the downgradient well decreased made it obvious that source soil material still existed. It was determined that the remaining source soils needed to be investigated and removed, and they were.

In the case of free product, being able to see the interaction between groundwater elevations and the presence or absence of free product is also important. One often overlooked but equally important feature is the position of the groundwater interface within the well's screened interval. If the screen is submerged completely beneath the groundwater interface, free product cannot flow into the well, giving the unaware project manager the impression that free product is not present at that well location. In Graph C, the black line represents the top of the well, the orange line represents the bottom of the well, and the blue line represents the groundwater elevation. The only opportunity for product to flow into this well was on 11-21-2008, when the water elevation was beneath the top of the screened interval. If this well had free product, you would see a red line on top of the blue water table elevation line. These types of graphs are quick and effective at illustrating information like this.
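
Here is a minimal sketch of how a "Graph C"-style plot could be assembled with matplotlib. The dates and elevations are made up for illustration, and the line colors simply follow the convention described above; this is not the plotting code SampleServe uses.

```python
# Minimal sketch of a "Graph C"-style plot: groundwater elevation over time,
# plotted against the top and bottom of the well. All values are hypothetical.
import matplotlib.pyplot as plt

dates = ["05-2008", "08-2008", "11-2008", "02-2009", "05-2009"]
gw_elevation = [611.5, 610.8, 607.9, 612.3, 611.0]   # blue line: water table elevation (ft)
well_top = 609.0                                     # black line: top of the well/screen (ft)
well_bottom = 599.0                                  # orange line: bottom of the well (ft)

plt.plot(dates, gw_elevation, color="blue", marker="o", label="Groundwater elevation")
plt.axhline(well_top, color="black", label="Top of well")
plt.axhline(well_bottom, color="orange", label="Bottom of well")
plt.ylabel("Elevation (ft)")
plt.legend()
plt.show()

# In this hypothetical data set, free product could only enter the well at the
# 11-2008 event, the one date where the blue line falls below the black line.
```
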

If you have high contaminant concentrations in groundwater, you may have free product in your source soils, but if your screens are too deep, then you will not see that free product and may actually think you don't have any. I've seen this mistake many, many times. These types of visual graphs quickly and effectively illustrate whether your screen placement prevents the identification of the presence of free product. Additionally, the correlation of groundwater elevation, up or down, and the presence of free product can also be quickly identified.

Good graphics not only allow a project manager to determine correlations and thereby evaluate causation; they also allow a good project manager to convey that information and those conclusions in a simple fashion to a client or a regulator.

All these graphics can be generated using existing site data in about 3-5 seconds using SampleServe's groundwater project management application. Using traditional methods such as Excel would take several hours.

To learn more about our groundwater project management application and how you can use it for your own projects go to SampleServe.com.

- Russell Schindler

Sunday, March 2, 2014

Game Changer - Remediation May Be Overrated


"You can't help but... with 20/20 hindsight, go back and say,  
Look, had we done something different, we probably  
wouldn't be facing what we are facing today."
- Norman Schwarzkopf

Selecting a remedial approach can be a complicated task. The process is based on identifying desired clean-up goals, sensitive issues that could affect a remediation strategy (e.g. proximity to drinking water wells, surface waters, other sensitive receptors, or public perception/relations), and costs. Each site presents different decisions and uncertainties regarding remediation options. However, the process of deciding which corrective action to take should be essentially the same at each site. The four main steps are 1) delineation, 2) risk evaluation, 3) feasibility, and 4) cost. The individual parameters and details of each step will differ for each site, but the process will be the same. The companies that understand the process, and also work to keep it simple, end up selecting the most effective and least costly solutions.

When estimating remedial action costs among proposed alternatives, the cost estimate for each remedial action alternative should include costs for mobilization, equipment, treatment, disposal, site restoration, energy, ongoing monitoring, periodic regulatory reporting, and operation and maintenance. Costs should be limited to those incurred by the party implementing the remedy and should not include costs associated with regulatory agencies or any perceived property value impact. Estimates should provide for a relative comparison of costs between all the feasible alternatives; alternatives that are not feasible should not be included. These estimated costs should be detailed and accurate, typically within 30 to 50 percent (plus or minus) of actual cost if the alternative were to be implemented.

Costs that will be incurred in the future, as part of the remedial action, should be identified and noted for the year in which they will occur. The distribution of costs over time will often be a critical factor in making tradeoffs between capital-intensive technologies (a remediation system now) and less capital-intensive technologies (long-term monitored natural attenuation). When estimating future costs, a discount rate is used to account for the "time value of money," usually between 3 percent and 7.5 percent per year. In other words, a dollar spent today is worth more than a dollar spent three years from now. The appropriate rate will vary depending on assumptions regarding inflation or deflation.

A net present value and future worth analysis must be performed to estimate the current value of future costs. Future costs are discounted relative to the current year, which allows the cost of remedial action alternatives to be compared on the basis of a single figure in the current year. One way to look at it: the net present value is the amount you would have to put in a theoretical interest-bearing bank account today so that the original amount, plus the interest it would earn, would be sufficient to cover all future costs of the remedial action alternative. By using the "net present value" evaluation, you're basically reverse-estimating what your interest rate would be on the money in this theoretical bank account.
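
To make the mechanics concrete, here is a minimal sketch of the discounting described above. The annual cost, the 10-year time frame, and the 5 percent discount rate are hypothetical, chosen only to show the arithmetic.

```python
# Minimal sketch of the net present value calculation described above.
# Costs, years, and the discount rate are hypothetical.
def present_value(future_cost, discount_rate, years_from_now):
    """Discount a future cost back to today's dollars."""
    return future_cost / (1.0 + discount_rate) ** years_from_now

# Hypothetical alternative: $15,000/yr of monitoring and reporting for 10 years.
discount_rate = 0.05                      # 5% per year, within the 3%-7.5% range mentioned above
annual_cost = 15_000.0
npv = sum(present_value(annual_cost, discount_rate, yr) for yr in range(1, 11))

print(f"Nominal total: ${annual_cost * 10:,.0f}")   # $150,000 spent over the 10 years
print(f"Net present value: ${npv:,.0f}")            # roughly $116,000 in today's dollars
```

The higher the discount rate you can justify, the smaller those long-term monitoring costs look in today's dollars, which is exactly the point made in the next paragraph.
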

The "Game Changer" point I want to make regarding the remedial alternative selection process above is that the net present value "discount rate" used by most remediation firms is in most cases much too low. The discount rate range mentioned above doesn't account for the ever increasing rate of technological advancements related to remedial action.  As with all technology, costs are decreasing. Clean-up technologies are improving and cost to install the typical "off the shelf" remediation system is lower than it was 10 years ago, especially when accounting for inflation. Risk assessment science and exposure assumptions are, for the most part, improving as the science of risk assessment improves. This area of science can be swayed somewhat by shifting political winds but the general trend is towards allowing for more cost effective monitored solutions.Monitoring well sampling and laboratory analysis costs are on the decline because of the advent of third party sampling companies and advances in laboratory technologies. The price for a BTEX analysis is roughly 30 to 40 percent less now than we paid in the '90s, in straight dollars, without adjusting for inflation. Adjust for inflation and its well less than half of what was being paid in the mid '90s. Regulatory reporting costs from monitoring and O&M activities are on the decline because of intelligent software and reporting automation. Just as sure as the price of a computer or a cell phone has dropped over time (while the function and power of the devices has improved), so has environmental remediation science and technology.

Many feasibility assessments, then and now, are flawed in their long-term cost assumptions. The long-term costs associated with limited remediation and/or monitored natural attenuation are typically exaggerated. Conversely, remediation system time frames, system mechanical reliability, and costs to operate are often understated. Waiting for environmental science to continue to advance while watching Mother Nature do what she does is often a perfectly viable (and least costly) remedial alternative.

The corrective action feasibility analysis described above doesn't have to be a one-time process either. For long-term ongoing projects, and even projects in the midst of ongoing remediation, a re-evaluation of the remedial alternatives and the "time value of money" costs associated with each option should be conducted on a periodic basis. Just because you have spent money on a system doesn't mean you should continue spending money on it, especially if it's operating below its performance projections and above its proposed costs. In that event, it might be the perfect time for a re-evaluation. The outcome of the re-evaluation could be to re-affirm the continued operation of the system. Making the affirmative choice to continue is more desirable than having it just happen by default.

If you would like to talk about how we can precisely dial in your long-term monitoring costs with a long-term contract, I can be reached at schindler@sampleserve.com or at (231) 218-7955.
-- Russell

12-Volt Stainless Steel Centrifugal Pumps - A Comparison

 
"Beware of little expenses; a small leak will sink a great ship."
-- Benjamin Franklin

One of our specialties at SampleServe.com is sampling groundwater. As such, we have used a variety of types of pumps in a multitude of makes and models. In this article I am going to tell you about two 12-volt stainless steel centrifugal pumps I've used and explain why I like one over the other.

The two pumps I am reviewing are the Geotech SS Geosub 12 VDC Sampling Pump and the Proactive Stainless Steel Hurricane® XL.  I am not aware of any other makers of "stainless steel" environmental 12-volt pumps sold in the US. There are plastic 12-volt pumps; however, plastic pumps are not part of my evaluation.

I've had the opportunity to use both of these pumps at length. I used the Geosub Pump recently on a large project in Ohio and sampled approximately 30 to 40 wells with it. The pump performed fine without any maintenance or operational problems. It was easy to decontaminate and appeared well built and sturdy. It is slightly larger than the Hurricane XL, came with 200' of wire, and was rated as being able to pump water from up to 200' below grade. Water at the site was in the range of 30' to 40' below grade, so we never tested the maximum depth on this pump.

One of the issues, or drawbacks, I had with the pump was the Geotech SS Geosub Controller that is used with the Geosub Pump. To operate the pump in the field, we had to have a car battery connected to a 12-volt AC inverter, which was connected to the 110-volt power cord of the Geosub Controller, which then converts the power back down to the 12-volt pump's operating range. It seemed like an unnecessary and energy-consuming step: 12 volts, up to 110 volts, back down to 12 volts.

Although the controller was mounted in a Pelican case and was sturdy, it was also overly large and heavy. One final issue with the controller: on initial pump start-up, the pump's digital speed setting defaults to full speed. After a few wells, we learned to immediately throttle the pump down upon starting it, or else deal with the rush of water. While sampling, the controller was able to dial down to very low flow rates and was also able to maintain precise flow rates without any noticeable flow rate fluctuation.

The Geosub Pump and the Geosub Controller are sold separately. I was recently quoted a price of $2,087.00 for the Geosub Pump and $2,195.00 for the Geosub Controller, for a total price of $4,282.00. You will also have to purchase a 12-volt AC inverter for approximately $80.00.

The Hurricane XL is an updated version of the older Hurricane pump, of which I own two. The differences between the old version and the new version are the following: the new version has a rounded top, which minimizes hang-ups when pulling the pump out of the well, and the easily replaceable motor has been redesigned. I have not used the XL; however, the differences between the versions are inconsequential in my critique of the pump.

I've used the Hurricane extensively and have sampled hundreds of wells with the pump. The pump has always performed fine, with minimal maintenance or operational problems. The pump comes with a field-replaceable motor that can be exchanged in seconds. The motors cost approximately $250.00 apiece and, if maintained properly, can last for a couple hundred wells. The pumps are easy to decontaminate and are well built and simple. The Hurricane XL comes with 150' of wire and is rated as being able to pump water from up to 150' below grade. I have personally used the pump at sites with a depth to water of 85' and had no problem maintaining decent flow. I have not used the pump at a site with water deeper than 85' below grade, so I've never tested the maximum depth on this pump.

The Proactive Low Flow Controller With Power Booster 2.5 XL "LCD" is straightforward and simple to use. You simply connect the controller to a 12-volt car battery and the other end to the pump. A dial controls the flow, with the voltage output to the pump displayed on the digital screen. The controller is slightly smaller than a six-pack of beer (cans), is mounted in its own metal case, and is light, sturdy, and durable.

The controller is able to dial down to very low flow rates; however, it periodically and mysteriously fluctuates the flow up or down by 10% to 20%, sometimes making it difficult to maintain a precise flow rate. Having good, clean battery connections minimizes this problem.

The Hurricane XL Pump and the 2.5 XL "LCD" Controllers are sold separately. I was recently quoted a price of $1,645.00 for the Hurricane XL Pump and $860.00 for the 2.5 XL "LCD" Controller, for a total price of $2,505.00.  No 12-volt AC inverter needed.

Both pumps performed for the tasks required; however, my favored pump, based on price and simplicity of use, is the Hurricane XL. It's less than 58% of the cost of the Geosub setup once you include the inverter, and it's less cumbersome and simpler to use. I am not affiliated with either of these companies, other than having bought two Proactive Hurricane pumps and rented the Geosub.

If you have an opinion different from mine, I would love to hear about it. I can be reached at schindler@sampleserve.com or at 231-218-7955.

-- Russell