Wader population trends at Tahuna Torea

While researching my latest nature report to the Tahuna Torea Residents and Rangers I found some great bird counts made in the reserve in the 1980s that had been entered on eBird. Unfortunately, an oversight in the design of the website means you need to know a statistical programming language to extract population trend data for a location (although if you stumble across an old checklist you can download the data). At the meeting Chris Barfoot supplied me with a brilliant 1993 report on the reserve, which contained a new set of data recorded by Michael Taylor.

This new data adds valuable insight into the decline of waders in the Tamaki Estuary which the Tamaki Estuary Environmental Forum has recently published an article on. I have compiled the data and plotted it for key species below.

UPDATE 23 JUNE 2021. Here is an interesting snippet of history compiled by the Howick & Pakuranga Times: “Kuaka [bar-tailed godwits] and red knots gather on the Cockle Bay estuary in thousands before flying to Korea and on to Alaska to breed each March, to return in September. In February-March they swoop over Point View ridges where farmers used shotguns before the birds were protected in 1941. They were plucked and preserved in their fat in ceramic jars.” Source: https://www.birdingnz.net/forum/viewtopic.php?f=3&t=11024

UPDATE August 2021. More data found in old journals. Graphs updated and presented to the Ōrākei Local Board.

Feeding our forests

I have begun doing some work with the Northern New Zealand Seabird Trust, who invited me to come and help them with some field work on the Poor Knights Islands. My father had visited the islands when he worked for DOC in the 1990s; his stories about the abundance of reptiles really inspired me to do restoration work, and I jumped at the opportunity to go.

Landing on the islands is notoriously difficult and our first attempt was delayed; we had to go back to Auckland to wait for better weather. The islands are surrounded by steep cliffs that made European habitation impractical, and Māori left the area in the 1820s. This means the island I visited has never had introduced mammals, not even kiore! I spent days cleaning my gear to get through the biosecurity requirements, which are incredibly strict for good reason.

I have explored a few predator-free islands, including Hauturu / Little Barrier Island, which has been described as New Zealand’s most intact ecosystem. However, it was only cleared of rats in 2004. When I am photographing invertebrates at night in mainland sanctuaries or forests with predator control (like Tāwharanui Regional Park or parts of the Waitakere Ranges) I see one reptile every eight hours or so. On Hauturu / Little Barrier Island I see one every 20 minutes, but on the Poor Knights it was one every two minutes! Bushbird numbers were lower than on other islands; I expect this is because reptiles and birds compete for prey species. I wonder if reptile numbers on other islands might be slower to recover because they are preyed on by bushbirds. I reckon that the total reptile and bushbird biomass on the Poor Knights is much greater than on the restored islands I have visited. One reason is that reptiles use less energy to hunt than bushbirds, but another might be that the Poor Knights have more seabirds.

While walking through the bush at night I would sometimes hear a crashing in the canopy followed by a soft thump on the ground. In an incredible navigational feat, the seabirds somehow land only metres from their burrows. At night I heard Buller’s shearwaters, grey-faced petrels, little penguins and diving petrels (fairy prions finish breeding in February). While monitoring birds at night I was showered with dirt by a Buller’s shearwater that was digging out a burrow. In my short time on the island I saw cave weta and three species of reptile using the burrows. Like a rock forest, the burrows add another layer of habitat to the ecosystem. It was incredibly touching to see the care and compassion the researchers had for the chicks that were starving while waiting for their parents, who often have to travel hundreds of kilometres to find enough food. The chicks that don’t make it die in their burrows and are eaten by many invertebrates; the invertebrates in turn become reptile or bushbird food. The soil on the island looked thick and rich, and when it rains nutrients are brought down into the small but famous marine reserve, which is teeming with life.

I was only on the island for three nights, but I was very fortunate to experience a pristine ridge-to-reef ecosystem. Seabirds are incredible ecosystem engineers that were an integral part of New Zealand’s inland forests for millions of years. Communities are making small efforts to bring seabirds back to predator-free island and mainland sites, but with no control over seabird food sources. If we really want intact ecosystems we will have to make sure our oceans have enough food for seabirds to feed our forests.

Pakiri horse mussel beds

For the last eight years I have been working on restoring green-lipped mussel (Perna canaliculus) beds in the Hauraki Gulf. The main reason we started with that species is that there is a commercial supply. However, there is a bigger native mussel that has even more potential than the green-lipped mussel, both as a habitat and as a water cleaner.

Horse mussels / hururoa (Atrina zelandica) are huge pumps. They are more than twice as big as green-lipped mussels (up to 400 mm in length) but have fragile shells which are vulnerable to fishing gear. They also don’t move (unlike green-lipped mussels) and are sensitive to changes in substrate.

Last century horse mussel beds were some of the best fishing spots in the Hauraki Gulf. I regularly ask divers about horse mussels: Where did you see them? How dense was the bed? There are spots with horse mussels in the Gulf, but I don’t know of any significant beds left. If you do, please let me know about them.

I was recently sent these images of a stunning horse mussel bed at Pakiri. They are from a 1996 NIWA report titled “Mangawhai – Pakiri Sand Study, Module 2: Technical Report, Marine Sands”. They show a large, dense horse mussel bed that has since been destroyed by sand mining. The beds ran the whole length of the embayment at depths of 15-20 m. I am posting them here to show the kind of seafloor we could have if we treated it better.

We don’t yet understand the horse mussel lifecycle or what species / substrates might attract juveniles. It’s interesting to note the presence of finger sponge and branching red algae, the hard edge to the bed, and the way some of the shells align.

Overseas there have been attempts at restoring similar species (photos below). Sea Change called for the “Initiation of a horse mussel restoration programme, with an initial focus on the Mahurangi and Whangapoua harbours.” But we could look at restoring any of the sites with historic beds. Wouldn’t it be awesome if we stopped smashing the seafloor and brought back these giant pumps to clean the water and create homes for fish?

Mapping life on the seafloor

I’m experimenting with mapping the seafloor for restoration projects. This is what I have done so far:

  1. Record a time-lapse sequence, swimming as slowly as possible (GoPro Hero 7 Black, Linear, 0.5 sec interval)
  2. Buy Agisoft Metashape Standard (about NZ$300)
  3. Import the photos into a chunk
  4. Align photos (Accuracy: High, as Highest produced duplication errors; Reference preselection: Sequential; Guided image matching on)
  5. Don’t clean up the point cloud (not needed for orthomosaics)
  6. Build mesh (Source data: Sparse cloud; Surface type: Arbitrary 3D)
  7. Build texture (Adaptive orthophoto, Mosaic, Enable hole filling, Enable ghosting filter)
  8. Capture view (hide cameras and other visual aids first, then export a .PNG at a ridiculous resolution)

Agisoft Metashape worked much better than Adobe’s photo-stitching software (Photoshop & Lightroom) on the same data. But I need more overlapping images, as none of the software packages were able to match all of any of the four test sequences I did.

I’m going to test shooting video next. The frames will be smaller, 2704×1520 (if I stick with Linear to avoid extra processing for lens distortion), instead of the 4000×3000 of the time-lapse, but I’m hoping all the extra frames will more than compensate (2 fps => 24 fps).
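Here is the arithmetic behind that hope, as a quick sanity check. It is a rough sketch that only compares raw pixel throughput; it ignores motion blur and video compression, which video will suffer more from.

```python
# Compare raw pixel throughput: 0.5 s time-lapse photos vs Linear video.
# Illustrative arithmetic only - ignores motion blur and video compression.

def megapixels_per_second(width, height, fps):
    """Total pixels captured per second, in megapixels."""
    return width * height * fps / 1e6

timelapse = megapixels_per_second(4000, 3000, 2)   # 0.5 s interval = 2 fps
video = megapixels_per_second(2704, 1520, 24)      # Linear video mode

print(f"time-lapse: {timelapse:.1f} MP/s")         # 24.0 MP/s
print(f"video:      {video:.1f} MP/s")             # 98.6 MP/s
print(f"video gives {video / timelapse:.1f}x the pixels per second")
```

So on raw numbers video should win by about four to one, which is why the extra frames might more than compensate for the smaller frame size.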

In theory an ROV would be better, but I don’t think there are any on the market that know where they are based on what they can see. All the workarounds for knowing where you are underwater are expensive; here are two: UWIS & Nimrod. I want to see if we can do this with divers and no location data. I don’t think towing a GPS will be accurate enough to match up the photos, though it does seem to work with drones taking images of bigger scenes (I want this to work in 50 cm visibility). I expect that if I want large, complete images the diver will need to follow another diver who has laid a line on the seafloor. One advantage of this is that the line could carry a scale, but I’m hoping to avoid it as the lines will be ugly 😀 So far I can only do two turns before the alignment fails. There are three swim patterns that might work (Space invaders, Spirals and Pick-up sticks). For my initial trials I am focusing on Space invaders.

Video provides lots more frames and the conversion is easy. A land-based test with GPS disabled, multiple turns, 2500 photos, 2704×2028 Linear, space-invader pattern at 1.2 m from the ground worked perfectly. However, I can’t get it to work underwater. In every test so far Metashape will only align 50-100 frames. I tried shooting on a sunny day, which was terrible, as the reflection of the waves dancing on the seafloor confuses the software. But two follow-up shoots also failed, and when I look at the frames Metashape can’t match I just don’t see why it can’t align them. These two images are in sequence: one gets aligned and the next one is dropped!

Here is what the test footage looks like; I have increased the contrast.

I have also tried exporting the frames at 8 fps to see if the alignment errors are happening because the images are too similar, but got similar results (just faster).

Detailed advice from Metashape:

Since you are using Sequential pre-selection, you wouldn’t get matching points for the images from the different lines of “space invader” or “pick up sticks” scenarios or from different radius of “spiral” scenario.

If you are using “space invader” scenario and have hundreds or thousands of images, it may be reasonable to align the data in two iterations: with sequential preselection and then with estimated preselection, providing that most of the cameras are properly aligned.

As for the mesh reconstruction – using Sparse Cloud source would give you very rough model, so you may consider building the model from the depth maps with medium/high quality in Arbitrary mode. As for the texture reconstruction, I can suggest to generate it in Generic mode and then in the model view switch the view from Perspective to Orthographic, then orient the viewpoint in the desired way and use Capture View option get a kind of planar orthomosaic projection for your model.

Aligning with ‘Sequential’ preselection only ever gets about 5% of the shots. Repeating the alignment with ‘Estimated’ picks up the rest, but the camera alignment gets curved. I think I have calibrated the cameras to 24 mm (it’s hard to see whether that has been applied) but it doesn’t seem to change things.

I tried an above-water test and made a two-minute video of the Māori fish dams at Tahuna Torea. I used the same settings as above but dropped the quality down to medium. It looks great!

The differences between above and below water are: camera distance to subject, flotsam, visibility / image quality, and colour range. If the footage I am gathering is too poor for Metashape to align, it might mean we need less suspended sediment in the water to make the images. That’s a problem, as the places I want to map are suffering from suspended sediment, which is why they would benefit from shellfish restoration.

The Agisoft support team are awesome. They processed my footage with f = 1906 in the camera calibration, aligned the photos without using preselection, and set a 10,000 tie point limit. The alignment took 2.5 days but worked perfectly (click on the image below). There are a few glitches, but I think the result is good enough for mapping life on the seafloor. I will refine the numbers a bit and post them in a separate blog post, wahoo!

Update Jul 2022: Great paper explaining the process with more sophisticated hardware


Final method

Here is my photogrammetry process / settings for GoPro underwater. I am updating them as I learn more. Please let me know if you discover an improvement. Thanks to Vanessa Maitland for her help refining the process.

Here is an example image made using the process, the area is about 8m in diameter, click on the image to see it full size.

Step 1: Make a video of the seafloor using a GoPro

  • If you’re at less than 5 m deep you will need to go on a cloudy day for even light
  • Make sure you shoot 4K, Linear. Also make sure the orientation is locked in preferences.
  • Record your location and the direction you’re going to swim if you want to put your orthomosaic on a map
  • Swim in a spiral, using a line attached to a peg to keep your distance from it even
  • Don’t leave any gaps or you will generate a spherical image
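The spiral can also be planned roughly in advance. This is a minimal sketch assuming a camera footprint width on the seafloor and a target overlap between passes; both numbers are illustrative guesses, not measured values.

```python
import math

# Rough plan for a spiral swim over a circular patch of seafloor.
# Assumptions (illustrative, not measured): the width of seafloor the
# camera sees in one pass, and the sideways overlap wanted between passes.

def spiral_plan(patch_radius_m, footprint_width_m, overlap=0.6):
    """Return (turns, path length in m) for an Archimedean spiral whose
    passes are spaced so adjacent swaths overlap by `overlap`."""
    spacing = footprint_width_m * (1 - overlap)   # gap between passes
    turns = patch_radius_m / spacing
    # Area-based approximation: path length ~ patch area / spacing
    path_length = math.pi * patch_radius_m ** 2 / spacing
    return turns, path_length

turns, length = spiral_plan(patch_radius_m=4, footprint_width_m=1.5)
print(f"{turns:.1f} turns, ~{length:.0f} m swim")   # 6.7 turns, ~84 m swim
```

With those guesses, an 8 m diameter patch is roughly seven turns and an 84 m swim, which helps decide how much line to attach to the peg.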

Step 2: Edit your video

  • There are lots of software packages that will do this. I use Adobe After Effects, where I can increase the contrast in the footage and add more red light depending on the depth. You might also find it easier to trim your video here, but you can also do it in Step 5.

Step 3: Install and launch Agisoft Metashape Standard.

  • These instructions are for version 1.8.1

Step 4: Use GPU.

  • In ‘Preferences’, ‘GPU’, select your GPU if you have one. I also checked ‘Use CPU when performing accelerated processing’.

Step 5: Import video

  • From the menu choose: ‘File’, ‘Import’, ‘Import Video’.
  • The main setting to play with here is the ‘Frame Step’. We have had success using every third frame, which cuts down on processing time.
  • If you have multiple videos you will have to import multiple chunks; I recommend combining them before processing using ‘Workflow’, ‘Merge Chunks’. I have had better results doing this than processing each chunk individually and then choosing ‘Workflow’, ‘Align Chunks’.
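As a quick sanity check on Frame Step, the number of images you will end up aligning is easy to estimate. The clip length here is just an example, not a recommendation.

```python
# Estimate how many frames a given 'Frame Step' imports from a clip.
# Example numbers only: a GoPro clip at 24 fps, keeping every third frame.

def imported_frames(duration_s, fps=24, frame_step=3):
    """Frames imported when keeping every `frame_step`-th frame."""
    return duration_s * fps // frame_step

print(imported_frames(120))   # a two-minute clip -> 960 frames
```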

Step 6: Camera calibration

  • From the menu choose: ‘Tools’, ‘Camera calibration’
  • Change ‘Type’ from ‘Auto’ to ‘Precalibrated’
  • Change the ‘f’ value to 1906

Step 7: Align photos

  • From the menu choose: ‘Workflow’, ‘Align Photos’
  • Check ‘Generic preselection’
  • Uncheck ‘Reference preselection’
  • Uncheck ‘Reset current alignment’
  • Select ‘High’ & ‘Estimated’ from the dropdown boxes
    Under ‘Advanced’ choose:
  • ‘Key point limit’ value to 40,000
  • ‘Tie point limit’ value to 10,000
  • Uncheck ‘Guided image matching’ & ‘Adaptive camera model fitting’
  • Leave ‘Exclude stationary tie points’ checked

If 100% of the cameras are not aligned, try Step 8; otherwise skip to Step 9.

Step 8: Align photos (again)

  • From the menu choose: ‘Workflow’, ‘Align Photos’
  • Uncheck ‘Generic preselection’
  • Check ‘Reference preselection’
  • Uncheck ‘Reset current alignment’
  • Select ‘High’ & ‘Sequential’ from the dropdown boxes
    Under ‘Advanced’ choose:
  • ‘Key point limit’ value to 40,000
  • ‘Tie point limit’ value to 4,000
  • Uncheck ‘Guided image matching’ & ‘Adaptive camera model fitting’
  • Leave ‘Exclude stationary tie points’ checked

Now all the photos should be aligned; if not, repeat Steps 7 & 8 with higher settings and check ‘Reset current alignment’ in Step 7 only. I have been happy with models that have 10% of photos not aligned.

Step 9: Tools / Optimize Camera Locations

Just check the checkboxes below (default settings):

  • Fit f, Fit k1, Fit k2, Fit k3
  • Fit cx, cy, Fit p1, Fit p2

 Leave the other checkboxes (including Advanced settings) unchecked.

Step 10: Resize region
Use the region editing tools in the graphical menu to make sure that the region covers all the photos you want to turn into a 3D mesh. You can change what is displayed in the viewport under ‘Model’, ‘Show/Hide Items’.

Step 11: Build dense cloud

  • Quality ‘High’
  • Depth filtering ‘Moderate’
  • Uncheck all other boxes.

Step 12: Build mesh

  • From the menu choose: ‘Workflow’, ‘Build Mesh’
  • Select ‘Dense cloud’, ‘Arbitrary’
  • Select ‘Face count’ ‘High’
  • Under ‘Advanced’ leave ‘Interpolation’ ‘Enabled’
  • Leave ‘Calculate vertex colours’ checked

Step 13: Build texture

  • From the menu choose: ‘Workflow’, ‘Build Texture’
  • Select ‘Diffuse map’, ‘Images’, ‘Generic’, ‘Mosaic (default)’ from the dropdown menus.
  • The texture size should be edited to suit your output requirements. The default is ‘4096 x 1’
  • Under ‘Advanced’ Turn off ‘Enable hole filling’ & ‘Enable ghosting filter’ if you’re using the image for scientific rather than advocacy reasons.

Step 14: Export orthomosaic
You can orient the view using the tools in the graphical menu. Make sure the view is orthographic before you export the image (5 on the keypad). Then choose ‘View’, ‘Capture View’ from the menu. The maximum pixel dimensions are 16,384 x 16,384. Alternatively you can export the texture.
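For scale, that export limit puts a floor under the detail you can keep. A minimal sketch, assuming a scene about 8 m wide like the example image earlier in the post:

```python
# How much seafloor one exported pixel represents, given the 16,384 px
# 'Capture View' limit. The 8 m scene width is an example, not a constant.

def ground_sample_distance_mm(scene_width_m, export_px=16384):
    """Millimetres of seafloor per exported pixel."""
    return scene_width_m * 1000 / export_px

print(f"{ground_sample_distance_mm(8):.2f} mm per pixel")  # 0.49 mm per pixel
```

About half a millimetre per pixel for an 8 m patch, which is a reason to split very large areas into multiple exports.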

==

Let me know if you have experimented with converting time-lapse / hyperlapse video to photogrammetry. There may be some advantages.

Stream Health Monitoring guides

As communities get increasingly worried about the declining quality of their waterways, there is more interest in stream health assessments. I am a huge fan of the Waicare Invertebrate Monitoring Protocol (WIMP), which is simple enough that school students can use it. However, the Waicare programme has been largely defunded by Auckland Council and there is no way for the public to share WIMP data. NIWA and Federated Farmers of New Zealand have put together https://nzwatercitizens.co.nz/ based on the New Zealand Stream Health Monitoring and Assessment Kit (SHMAK). It is great but incredibly hard to use; the manual is horrific. I believe this is being addressed but it will take years. To help, the Science Learning Hub has made this great guide for teachers and students, and NIWA have put together some videos. They are not published together anywhere online, so I have posted the list below:

Alternative words for environmental terms

I really like this list of alternative words for environmental terms, offered by George Monbiot & Ralph Steadman. I have rebuilt it in HTML with some additions and deletions, and I plan to evolve it over time.

Existing term | What’s wrong with it | Alternative term
Environment | Cold, technical; seen as separate | The natural world
Global warming | ‘Warm’ sounds pleasant | Global overheating
Biodiversity | Inaccessible | Wildlife
Ecosystem services | Anthropocentric and reductive | Life support systems
Nature reserve | ‘Reserve’ suggests coldness | Wildlife refuge
Habitat destruction / Deforestation / Biodiversity loss | Sounds like they are happening by themselves | Ecocide
Conservation | Preserving what little is left rather than rebuilding living systems (New Zealand needs a Department of Restoration) | Restoration
Clean rivers / seas | Sounds too hygienic; blind to waters as habitats | Thriving rivers / seas
Fossil fuels | Suggests redundancy | Dirty fuels
Sustainable development | Green growth is an oxymoron | Regenerative development
Photopollution | Too technical; doesn’t indicate what is being impacted | Ecological light pollution
Stormwater | Suggests that the water is unwanted, unnecessary or unsavoury | Rainwater

MARINE
Fish stocks | Suggests fish are here to serve us | Wild fish populations
Biofoul, foul, foul ground | Suggests there is something ugly about biogenic habitats | Sea life, seafloor life, benthic epifauna
Fishing | Casual everyday activity | Killing native wildlife
Bait fish | Implies the fish exist to be bait | Small schooling fish / forage fish / small pelagic fish / shoaling fish
Bait ball | Implies the fish exist to be bait | Tight ball of fish
Seaweeds | Pest connotations | Ocean plants
Mobile bottom contact fishing | Implies a light touch | (Mobile) Bottom impact fishing

The mussel line

Marine restoration is a lazy business. All you have to do is stop fishing an area and marine ecosystems heal themselves. However this is not the case with green-lipped mussels in New Zealand.

Hundreds of square kilometres of subtidal mussel beds were fished to extinction in each harbour around New Zealand.

The industry collapsed and, more than half a century later, the beds have not returned. In the Hauraki Gulf there are a few places where you can still find green-lipped mussels. You would think that these places would be deep under the ocean (green-lipped mussels have been found at 50 m deep), but they are not.

Most are in the intertidal zone on rocky shores. Here there is usually a gradient, with mussels sparse higher up and getting denser towards the low tide mark, where they abruptly stop. I have asked several local experts and no one has a solid answer as to why they stop so abruptly.

As we spend hundreds of thousands of dollars restoring subtidal beds, maybe the key to unlocking a lazier (and cheaper) solution is staring us in the face. Here are some thoughts on why the line exists:

Mussel predation
  1. Avian predation. Every exposed mussel bed has at least one pair of Variable oystercatchers eating the smaller mussels at every low tide. Living in the subtidal zone would therefore be of some advantage, yet we are more likely to see juveniles higher up. This suggests there is even more predation from below.
  2. Starfish predation. Eleven-armed starfish were a big problem in the first beds put down by Revive our Gulf. However, starfish do okay in the intertidal zone and are not particularly abundant in intertidal beds. We don’t see lots of them waiting below the low tide mark next to intertidal mussel reefs.
  3. Fish predation. This seems like the most obvious cause, but surely it can’t be Snapper, as they have been fished down to 20% of their natural biomass. Rays are a possibility, but I would have thought I would see more of them in the shallows if this were the case. This southern study looked at predation and found it to be largely subtidal and nocturnal, by fish and large crabs.
  4. Octopus predation. They are nearly invisible and love eating mussels, so at first this is a good fit. But octopuses leave the shells; I will look for evidence on the next intertidal bed I explore. None of the predation theories explains why the line is so sharp.
  5. Food. Mussels eat phytoplankton and algae, and I am sure there will be more of this close to the surface. I am pretty sure this is why mussel farmers grow their mussels high in the water column. Wild mussels might also benefit from wave action on the rocks, as it would increase the oxygen in the water. However, I would have thought these benefits would be offset by the fact that they can’t feed while exposed to the air.
  6. Sediment. If there is some other benefit to being exposed to the air, maybe it’s that the water in the Gulf just has too much stuff in it. Mussels have to work hard sorting the food from the dirt; maybe mussels are not good at taking a break, and being forced to is good for them. Comparing the condition of intertidal and subtidal mussels would help test this idea.

Or maybe, like a lot of things in biology, it’s a mixture of the above factors. As we are slowly losing our intertidal mussel beds, it might be wise to set up a long-term monitoring project that could solve this mystery and inspire lazier restoration methods.

Probe holes

With a lot more mowing at Point England this winter, the flocks of South Island pied oystercatchers are leaving a visible sign in the paddocks. The probe holes are very dense right to the edges of the paddocks, which means I am also seeing holes from other species like White-faced heron and Pukeko. Casual counts put the number of little holes at around 100 per square metre.

Roost heights at Tahuna Torea

Today I took photos of the shorebird roosts at Tahuna Torea just after the tide had covered the main spit roost. The tide was predicted to be very high today (3 August 2019): 3.4 m at Westhaven, 8:53 am. Conditions were relatively calm and the tide had not (yet) reached the high tide line on the Little spit.

Names and locations of roosts
The tide was about 20cm (in height) from the high tide line.
Pathway
And the pathway which is sometimes flooded was about 20cm above the water.
The spit had just been compromised, with just a few Black-backed gulls and one Caspian tern remaining.
Little spit was about 50cm above the tide.
Little Godwit Island was about 60cm above the tide.
South Godwit Island was about 75cm above the tide.
North Godwit Island was about 40cm above the tide.
The southern end of Lockley Island was about 40cm above the tide.
The northern end of Lockley Island was about 40cm above the tide.
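Out of interest, those clearances can be turned into rough flooding thresholds for each roost. This is a sketch that assumes the 3.4 m Westhaven prediction held and ignores wind and wave run-up.

```python
# Convert today's clearances (cm above a 3.4 m tide) into the approximate
# tide height at which each roost goes under. Assumes calm conditions and
# that the 3.4 m Westhaven prediction was accurate.

OBSERVED_TIDE_M = 3.4

clearances_cm = {
    "Little spit": 50,
    "Little Godwit Island": 60,
    "South Godwit Island": 75,
    "North Godwit Island": 40,
    "Lockley Island (southern end)": 40,
    "Lockley Island (northern end)": 40,
}

def flooding_tide_m(clearance_cm, observed_tide_m=OBSERVED_TIDE_M):
    """Tide height (m) at which a roost with this clearance floods."""
    return observed_tide_m + clearance_cm / 100

for roost, cm in sorted(clearances_cm.items(), key=lambda kv: kv[1]):
    print(f"{roost}: floods at about {flooding_tide_m(cm):.2f} m")
```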

I have seen tidal debris in the mangroves around Lockley Island that was above the height of the island (indicating that it sometimes gets swamped), but I think it would have made an okay roost today.