I have spent a fair bit of time thinking about how citizen scientists can measure the health of their local streams using the Macroinvertebrate Community Index (via WIMP and SHMAK) and the Index of Biotic Integrity (IBI) for New Zealand fish. The great thing about using stream life to measure stream health is that the animals act as 24/7 sensors, responding to any of the countless pollutants that harm life. The problem is that finding, and more importantly identifying, species involves disturbing them.
eDNA (environmental DNA) sampling solves this by detecting stream life based on the tiny fragments of genetic material that all organisms constantly shed into the water. Wilderlab have set up a testing system with relatively cheap kits available to citizen scientists. I found it easy to use on my local stream (which is very degraded). I am really excited about this technology, especially as the price comes down and results are benchmarked against existing stream health indices.
Community groups and scientists have begun ‘Kelp Gardening’ in the Hauraki Gulf. The activity involves removing kina (sea urchins) from rocky reefs to allow the kelp to regrow. It gives divers and snorkelers something to do on a reef (where they no longer have fish to hunt) and fits nicely with New Zealand’s pest management ethos (suppression).
However, kina are not a pest: their numbers are artificially inflated by overfishing. See my diagram from the State of the Gulf report below.
Kina barrens are created by overfishing
Kelp gardening differs wildly from other active restoration techniques because it replaces a natural function with a human intervention. The activity may have a place in the creation of a marine protected area, but it is not a smart long-term reef management technique because it treats the symptom, not the cause, of a sick reef.
So why are well respected scientists and well intentioned volunteers doing it? I really think it’s just because calls for marine protection have fallen on deaf ears, and some people are so desperate to fix things they will try anything.
UPDATE: 11 March 2021: In a public seminar today Dr Nick Shears, who is an expert on kina barrens in New Zealand, said “kina removal can be incorporated into restoration / management but we want to make it clear it is not the answer on its own!”
I was inspired to make this illustration of the now extinct North Island snipe by this short paper of Mike Lee’s questioning their prehistoric mainland extinction. I have just finished reading his book Navigators and Naturalists: French Exploration of New Zealand and the South Seas (1769-1824), where he tells the stories of the first French explorers to New Zealand. In 1820 Captain Richard Cruise recorded shooting a snipe on Motukorea (Browns Island), where I monitor shorebirds now.
Māori introduced the Pacific rat (kiore), which is thought to be the main driver of the extinction of the North Island snipe. I have never seen a kiore (that I know of). It was interesting to read how the small rats were an important food source for Māori, and that some coveted the larger rats brought by Europeans.
The illustration was difficult because the only specimen of the North Island snipe has deteriorated. Colin Miskelly provided invaluable advice and corrections; I learnt a lot in the process. I’m very proud to have the image published on New Zealand Birds Online.
I am working with the Auckland Council mowing team to manage the grass for Northern New Zealand dotterel, which are a conservation-dependent species. In the Tāmaki Estuary there are no significant beaches, so the dotterel nest in the grass at Point England.
We need to mow the grass so that:
Dotterel can walk in it
Dotterel can see predators coming
We don’t provide food for predators (rats and pukeko)
However, we can’t mow near the chicks, and the chicks also need cover from predators. I did ask the grass to stop growing at 10 cm but it did not listen to me 😀
We currently have at least two chicks who are always seen with 2-6 adults near the old nest. I considered multiple mowing strategies and decided to trial a moat around the edge of the paddock to exclude agoraphobic predators (rats and cats).
So far the chicks are staying in the un-mown area, adults come and go but generally prefer the mown areas for roosting. I have used a line trimmer to create a shorter grass area in the centre which the adults keep their chicks near.
< 48hr old chick with egg tooth visible
UPDATE: 2 December
I was surprised to find a new three-egg nest inside the mown moat. Unfortunately it was abandoned on the 11th of December. None of the three chicks fledged, and they were not seen outside the un-mown area. They seemed to prefer the areas where the buckhorn plantain seed heads were less abundant and there was more buttercup.
This artificial shag roost in Hobson Bay was constructed by Auckland Council to mitigate the effects of a boardwalk being built next to an existing shag roost. Nesting materials and a plywood decoy have been added, but the birds are yet to show interest in it.
Capture view (hide cameras and other visual aids first then export .PNG at a ridiculous resolution)
Agisoft Metashape worked much better than Adobe’s photo-stitching software (Photoshop & Lightroom) on the same data. But I need more overlapping images, as none of the software packages could match all the frames in any of the four test sequences I did.
I’m going to test shooting video next. The frames will be smaller, 2704×1520 (if I stick with Linear to avoid extra processing for lens distortion), instead of the 4000×3000 time-lapse frames, but I’m hoping all the extra frames (2 fps => 24 fps) will more than compensate.
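To get a feel for the fps trade-off, here is a rough forward-overlap calculation. This is a back-of-envelope sketch: the swim speed, height above the seafloor and field of view are assumed example numbers, not measurements from my dives.

```python
import math

def forward_overlap(speed_m_s, height_m, fov_deg, fps):
    """Fraction of one frame's seafloor footprint shared with the next
    frame, assuming a flat bottom and a straight-down pinhole camera."""
    # Along-track metres covered by a single frame.
    footprint = 2 * height_m * math.tan(math.radians(fov_deg) / 2)
    # Metres travelled between consecutive frames.
    baseline = speed_m_s / fps
    return max(0.0, 1 - baseline / footprint)

# Hypothetical numbers: 0.3 m/s swim, 1 m off the bottom, 70 degree FOV.
print(round(forward_overlap(0.3, 1.0, 70, 2), 2))   # 2 fps time-lapse -> 0.89
print(round(forward_overlap(0.3, 1.0, 70, 24), 2))  # 24 fps video -> 0.99
```

At these (made-up) numbers even 2 fps gives plenty of forward overlap, so the real gain from video may be in surviving turns and speed changes rather than straight-line coverage.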
In theory an ROV would be better, but I don’t think there are any on the market that know where they are based on what they can see. All the workarounds for knowing where you are underwater are expensive; here are two: UWIS and Nimrod. I want to see if we can do this with divers and no location data. I don’t think towing a GPS will be accurate enough to match up the photos, though it does seem to work with drones taking images of bigger scenes (I want this to work in 50 cm visibility). I expect that if I want large complete images, the diver will need to follow another diver who has left a line on the seafloor. One advantage of this is that the line could have a scale on it, but I’m hoping to avoid it as the lines will be ugly 😀 So far I can only do two turns before it fails. There are three patterns that might work (space invaders, spirals and pick-up sticks). For my initial trials I am focusing on space invaders.
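One way to see why the turns break things: in a space invaders (lawnmower) pattern, a frame and the frame directly beside it on the next line can be a whole line apart in capture order, so any matching that only compares frames close together in the sequence will never stitch neighbouring lines to each other. A quick sketch (the 500 frames per line and 4 lines are made-up numbers):

```python
def neighbour_gaps(frames_per_line, n_lines):
    """For a back-and-forth (lawnmower) survey, return the capture-order
    index gap between each frame and the frame directly beside it on the
    next line."""
    order = {}  # (line, position along line) -> capture index
    idx = 0
    for line in range(n_lines):
        positions = range(frames_per_line)
        if line % 2:  # odd lines are swum in the opposite direction
            positions = reversed(positions)
        for pos in positions:
            order[(line, pos)] = idx
            idx += 1
    return [order[(line + 1, pos)] - order[(line, pos)]
            for line in range(n_lines - 1)
            for pos in range(frames_per_line)]

gaps = neighbour_gaps(frames_per_line=500, n_lines=4)
print(min(gaps), max(gaps))  # 1 999
```

Frames that sit centimetres apart on the seafloor but nearly a thousand frames apart in the video is exactly the situation that frame-by-frame matching can’t handle.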
Video provides lots more frames and the conversion is easy. A land-based test with GPS disabled, multiple turns, 2500 photos, 2704 x 2028 Linear, space invader pattern at 1.2 m from the ground worked perfectly. However, I can’t get it to work underwater. In every test so far Metashape will only align 50-100 frames. I tried shooting on a sunny day, which was terrible as the reflection of the waves dancing on the seafloor confuses the software. But two follow-up shoots also failed; when I look at the frames Metashape can’t match, I just don’t see why it can’t align them. These two images are in sequence: one gets aligned and the next one is dropped!
Here is what the test footage looks like; I have increased the contrast.
I have also tried exporting the frames at 8 fps, to see if the alignment errors are happening because the images are too similar, but got similar results (just faster).
Detailed advice from Metashape:
Since you are using Sequential pre-selection, you wouldn’t get matching points for the images from the different lines of “space invader” or “pick up sticks” scenarios or from different radius of “spiral” scenario.
If you are using “space invader” scenario and have hundreds or thousands of images, it may be reasonable to align the data in two iterations: with sequential preselection and then with estimated preselection, providing that most of the cameras are properly aligned.
As for the mesh reconstruction – using Sparse Cloud source would give you very rough model, so you may consider building the model from the depth maps with medium/high quality in Arbitrary mode. As for the texture reconstruction, I can suggest to generate it in Generic mode and then in the model view switch the view from Perspective to Orthographic, then orient the viewpoint in the desired way and use Capture View option to get a kind of planar orthomosaic projection for your model.
Align ‘sequential’ only ever gets about 5% of the shots. Repeating the alignment procedure with ‘estimated’ picks up the rest, but the camera alignment gets curved. I think I have calibrated the cameras to 24mm (it’s hard to see if that has been applied) but it doesn’t seem to change things.
I tried an above water test and made a two minute video of the Māori fish dams at Tahuna Torea. I used the same settings as above, but dropped the quality down to medium. It looks great!
The differences between above and below water are: Camera distance to subject, flotsam, visibility / image quality and colour range. If the footage I am gathering is too poor for Metashape to align it might mean we need less suspended sediment in the water to make the images. That’s a problem as the places I want to map are suffering from suspended sediment – which is why they would benefit from shellfish restoration.
The Agisoft support team are awesome. They processed my footage with f = 1906 in the camera calibration, aligned photos without using preselection, and a 10,000 tie point limit. The alignment took 2.5 days but worked perfectly (click on the image below). There are a few glitches, but I think the result is good enough for mapping life on the seafloor. I will refine the numbers a bit and post them in a separate blog post, wahoo!
Here is my photogrammetry process / settings for GoPro underwater. I am updating them as I learn more. Please let me know if you discover an improvement. Thanks to Vanessa Maitland for her help refining the process.
Here is an example image made using the process, the area is about 8m in diameter, click on the image to see it full size.
Step 1: Make a video of the seafloor using a GoPro
If you’re at less than 5m deep you will need to go on a cloudy day for even light
Make sure you shoot 4k, Linear. Also make sure the orientation is locked in preferences.
Record your location and the direction you’re going to swim if you want to put your orthomosaic on a map
Swim in a spiral using a line attached to a peg to keep the distance from it even
Don’t leave any gaps or you will generate a spherical image
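The peg-and-line trick above amounts to swimming an Archimedean spiral, where a fixed gap between loops keeps side overlap even. Here is a sketch of that path as waypoints (the 0.5 m loop spacing and 4 m radius are just illustrative numbers, not a recommendation):

```python
import math

def spiral_waypoints(spacing_m, max_radius_m, step_deg=10):
    """Waypoints along an Archimedean spiral r = a * theta.
    Choosing a = spacing / (2*pi) keeps successive loops exactly
    `spacing_m` apart, which is what keeps side overlap constant."""
    a = spacing_m / (2 * math.pi)
    points = []
    theta = 0.0
    while a * theta <= max_radius_m:
        r = a * theta
        points.append((r * math.cos(theta), r * math.sin(theta)))
        theta += math.radians(step_deg)
    return points

pts = spiral_waypoints(spacing_m=0.5, max_radius_m=4.0)
```

The spacing you can get away with depends on how wide a strip of seafloor each frame covers at your swimming height.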
Step 2: Edit your video
There are lots of software packages that will do this; I use Adobe After Effects, where I can increase the contrast in the footage and add more red light depending on the depth. You might also find it easier to trim your video here, but you can also do it in Step 5.
Step 3: Install and launch Agisoft Metashape Standard.
These instructions are for version 1.8.1
Step 4: Use GPU.
In ‘Preferences’, ‘GPU’, select your GPU if you have one. I also checked ‘Use CPU when performing accelerated processing’
Step 5: Import video
From the menu choose: ‘File’, ‘Import’, ‘Import Video’.
The main setting to play with here is the ‘Frame Step’. We have had success using every third frame, which cuts down on processing time.
If you have multiple videos you will have to import multiple chunks. I recommend combining them before processing using ‘Workflow’, ‘Merge Chunks’; I have had better results doing this rather than processing each chunk individually and then choosing ‘Workflow’, ‘Align Chunks’.
Step 6: Camera calibration
From the menu choose: ‘Tools’, ‘Camera calibration’
Change ‘Type’ from ‘Auto’ to ‘Precalibrated’
Change the ‘f’ value to 1906
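As a sanity check on that f value (this is my own back-of-envelope pinhole arithmetic, not anything from Agisoft): focal length in pixels relates to horizontal field of view by f = (width / 2) / tan(HFOV / 2), so f = 1906 on a 2704-pixel-wide frame implies roughly a 70° horizontal FOV.

```python
import math

def hfov_deg(image_width_px, f_px):
    """Horizontal field of view implied by a pinhole focal length (pixels)."""
    return math.degrees(2 * math.atan((image_width_px / 2) / f_px))

print(round(hfov_deg(2704, 1906), 1))  # ~70.7 degrees
```

If the FOV this implies is wildly off what your lens mode should produce, the f value is probably not being applied.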
Step 7: Align photos
From the menu choose: ‘Workflow’, ‘Align Photos’
Check ‘Generic preselection’
Uncheck ‘Reference preselection’
Uncheck ‘Reset current alignment’
Select ‘High’ & ‘Estimated’ from the dropdown boxes. Under ‘Advanced’ choose:
‘Key point limit’ value to 40,000
‘Tie point limit’ value to 10,000
Uncheck ‘Guided image matching’ & ‘Adaptive camera model fitting’
Leave ‘Exclude stationary tie points’ checked
If not all cameras are aligned, try Step 8; otherwise skip to Step 9.
Step 8: Align photos (again)
From the menu choose: ‘Workflow’, ‘Align Photos’
Uncheck ‘Generic preselection’
Check ‘Reference preselection’
Uncheck ‘Reset current alignment’
Select ‘High’ & ‘Sequential’ from the dropdown boxes. Under ‘Advanced’ choose:
‘Key point limit’ value to 40,000
‘Tie point limit’ value to 4,000
Uncheck ‘Guided image matching’ & ‘Adaptive camera model fitting’
Leave ‘Exclude stationary tie points’ checked
Now all the photos should be aligned. If not, repeat Steps 7 & 8 with higher settings, checking ‘Reset current alignment’ in Step 7 only. I have been happy with models that have 10% of photos not aligned.
Step 9: Tools / Optimize Camera Locations
Just check the check boxes below (default settings):
Fit f, Fit k1, Fit k2, Fit k3
Fit cx, cy, Fit p1, Fit p2
Leave the other checkboxes (including Advanced settings) unchecked.
Step 10: Resize region
Use the region editing tools in the graphical menu to make sure the region covers all the photos you want to turn into a 3D mesh. You can change what is being displayed in the viewport under ‘Model’, ‘Show/Hide Items’.
Step 11: Build dense cloud
Quality ‘High’
Depth filtering ‘Moderate’
Uncheck all other boxes.
Step 12: Build mesh
From the menu choose: ‘Workflow’, ‘Build Mesh’
Select ‘Dense cloud’, ‘Arbitrary’
Select ‘Face count’ ‘High’
Under ‘Advanced’ leave ‘Interpolation’ ‘Enabled’
Leave ‘Calculate vertex colours’ checked
Step 13: Build texture
From the menu choose: ‘Workflow’, ‘Build Texture’
Select ‘Diffuse map’, ‘Images’, ‘Generic’, ‘Mosaic (default)’ from the dropdown menus.
The texture size should be edited to suit your output requirements. The default is ‘4096 x 1’
Under ‘Advanced’ Turn off ‘Enable hole filling’ & ‘Enable ghosting filter’ if you’re using the image for scientific rather than advocacy reasons.
Step 14: Export orthomosaic
You can orientate the view using the tools in the graphical menu. Make sure the view is orthographic before you export the image (5 on the keypad). Then choose ‘View’, ‘Capture View’ from the menu. The maximum pixel dimensions are 16,384 x 16,384. Alternatively you can export the texture.
==
Let me know if you have experimented with converting time-lapse / hyperlapse video to photogrammetry. There may be some advantages.
As communities get increasingly worried about the declining quality of their waterways, there is more interest in stream health assessments. I am a huge fan of the Waicare Invertebrate Monitoring Protocol (WIMP), which is simple enough that school students can use it. However, the Waicare programme has been largely defunded by Auckland Council and there is no way for the public to share WIMP data. NIWA and Federated Farmers of New Zealand have put together https://nzwatercitizens.co.nz/ based on the New Zealand Stream Health Monitoring and Assessment Kit (SHMAK). It is great but incredibly hard to use; the manual is horrific. I believe this is being addressed but it will take years. To help, the Science Learning Hub has made this great guide for teachers and students. NIWA have put together some videos. They are not published together anywhere online, so I have posted the list below:
I really like this list of alternative words for environmental terms, offered by George Monbiot & Ralph Steadman. I have rebuilt it in HTML with some additions and deletions, I plan to evolve it over time.
Existing terms | What’s wrong with it | Alternative terms
Environment | Cold, technical; seen as separate | The natural world
Global warming | ‘Warm’ sounds pleasant | Global overheating
Biodiversity | Inaccessible | Wildlife
Ecosystem services | Anthropocentric and reductive | Life support systems
Nature reserve | ‘Reserve’ suggests coldness | Wildlife refuge
Habitat destruction / Deforestation / Biodiversity loss | Sound as if they are happening by themselves | Ecocide
Conservation | Preserving what little is left rather than rebuilding living systems (New Zealand needs a Department of Restoration) | Restoration
Clean rivers / seas | Sounds too hygienic; blind to waters as habitats | Thriving rivers / seas
Fossil fuels | Suggests redundancy | Dirty fuels
Sustainable development | ‘Green growth’ is an oxymoron | Regenerative development
Photopollution | Too technical; doesn’t indicate what is being impacted | Ecological light pollution
Stormwater | Suggests the water is unwanted, unnecessary or unsavoury | Rainwater
MARINE
Fish stocks | Suggests fish are here to serve us | Wild fish populations
Biofoul, foul, foul ground | Suggests there is something ugly about biogenic habitats | Sea life, seafloor life, benthic epifauna
Fishing | Casual everyday activity | Killing native wildlife
Bait fish | Implies the fish exist to be bait | Small schooling fish / forage fish / small pelagic fish / shoaling fish
Do you have any ideas or suggestions we should consider, as we draft the management plan for local parks in your local board area? Are there any ideas of suggestions you have for the use, enjoyment, protection, management and development for local parks in your area? Please include the name or location of the park/s, if possible.
Please include a network of safe places for shorebirds to roost and nest in parks adjacent to the Tāmaki Estuary. The areas that are important to shorebirds have been mapped in the attached document Shorebirds-of-the-Tamaki-Estuary-by-Shaun-Lee.pdf. Detailed evidence for the preservation of the shorebird roosting and nesting habitat can be found in PE-Development-Full-Evidence-Shaun Lee.pdf and PE-Development-Futher-Evidence-Shaun-Lee.pdf, also attached. Since the Point England Development Enabling Bill passed, the stock have been removed from Point England. Despite our best efforts with mowing, productivity has declined dramatically. I suggest we return the stock; feel free to contact me or Council biodiversity staff if you would like to see the breeding reports.
Can you tell us what you like about the park(s) in your local board area? Please include the name or location of the parks, if possible.
The roosting and nesting shorebirds. Numbers of shorebirds in the parks have greatly reduced over the last few decades and some species are now locally extinct. We need to do a better job of looking after them as some species are going extinct globally. The most important parks in the Maungakiekie-Tāmaki area are the Point England Reserve and Mt Wellington War Memorial Reserve.
Can you tell us what you don’t like about the park(s) in your local board area? Please include the name or location of the parks, if possible.
There is no space dedicated to shorebirds. This means they are regularly disturbed by golf practice, frisbee, casual ball play, jogging, walking, drone flying, kite-surfing, kite flying, picnicking, dog walking and much more. Disturbance in parks is increasing with population density. Key roosting and nesting areas are threatened with development. Minor things like paths, lighting and even trees can negatively impact shorebird habitat.
Is there anything else you’d like to tell us about parks in your area? Please share any other thoughts about local parks here.
Please make sure decision makers are aware of the status of threatened and conservation-dependent species that use Maungakiekie-Tāmaki Local Parks; then they can make good decisions and start to restore habitats and the associated abundance and diversity.
You can see that Fisheries New Zealand has no intention of managing these ‘stocks’ to the target. Instead they manage populations down to the limit where action is required. This is like driving a car at the speed limit the whole time and slamming on the brakes at the corners. Some people may drive like that, but New Zealand should have better attitudes to managing its native marine biodiversity.