Tyre Depth Gauge for Loose Powder XRF Preps

When we prepare samples for analysis by XRF we tend to use a simple pressed-powder preparation method in the first instance. The depth of the sample is important for calculating matrix density and for correcting for non-infinite-depth samples (thin samples and/or those with a low average atomic number). We used to use a standard digital caliper, but it wasn’t ideal: the 150 mm calipers we had were unwieldy, and care was needed to level the depth-gauge probe accurately.

Erroneously small sample heights could be recorded using the depth gauge on a standard caliper (left). Using a digital tyre gauge (right) results in a more accurate measurement.


150 mm digital calipers (top) and a digital tyre depth gauge (bottom).

Enter the digital tyre depth gauge! These are easy to find online for a few quid. They have a nice wide guide and, with a range of up to 25 mm, they are perfect for measuring the depth in XRF loose-powder pots (we use pots that are nominally 22 mm tall). The method goes:

  1. Zero the display whilst measuring the depth of the empty pot: measure from the top of the pot to the base, and zero the display.
  2. Fill the pot with sample, press and weigh.
  3. Measure from the top of the pot to the top of the sample: having filled the pot with the sample, probe from the top of the pot down to the sample surface.
  4. The height of the sample is the additive inverse of the reading (so a reading of -7.45 mm means a sample height of 7.45 mm) – see the sketch below.
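Here’s a minimal R sketch of the arithmetic, from gauge reading to sample height and on to the matrix density mentioned above. All of the values (pot diameter, reading, mass) are made up for illustration – substitute your own.

```r
# Minimal sketch: gauge reading -> sample height -> bulk density.
# All values are illustrative; substitute your own pot dimensions and masses.
pot_diam_mm   <- 30     # assumed internal diameter of the pot
reading_mm    <- -7.45  # gauge reading, zeroed on the empty pot
sample_mass_g <- 8.2    # mass of the pressed powder

height_mm     <- -reading_mm                                  # additive inverse of the reading
volume_cm3    <- pi * (pot_diam_mm / 20)^2 * (height_mm / 10) # cylinder: pi * r^2 * h, in cm^3
density_g_cm3 <- sample_mass_g / volume_cm3                   # ~1.56 g/cm^3 for these values
```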

How The Itrax Core Scanner Works – New Poster

Having described the basic mode of operation of the Itrax core scanner countless times in the past few months, I thought it was time for a proper poster with nice diagrams, so I recently made this poster for the Itrax facility I’m looking after. It covers how x-ray beams are created and controlled in the scanner, the principles of measurement, and how chemical compositional data can be derived from fluorescence spectra. It’s available from the resources page of this website. Let me know if you print one and use it – I love seeing photos of resources I’ve created being used!

If you need more information on the Itrax core scanner, this paper and this edited volume may be of interest.

The poster, printed and hung up in the laboratory

Ffridd-y-Fawnog

I took a trip to the Arenigs in North Wales with our second-year students enrolled on Phil Hughes’ GEOG20351 “Glaciers” course. I delivered a short presentation on work I did as an undergraduate in 2008 on a nearby lacustrine sequence with an excellent chironomid temperature proxy record that seems to date to the late-glacial and early Holocene. I’ve uploaded the document here if anyone wants to take a closer look.

Cedar Pollen Size Paper

Ben Bell, a PhD candidate, has just published his latest collaborative research on cedar pollen and climate variability. His research is focussed on ways in which Cedrus atlantica might be used as a proxy for (palaeo)climate in the Atlas. This paper examines a previously postulated link between pollen grain size and moisture availability, and concludes that moisture availability is not significantly related to grain size in this context.

The study makes use of a number of methods for determining grain size – light microscopy, scanning electron microscopy, and laser granulometry. I was involved in the laser granulometry aspect, which Ben shows experimentally to be comparable to the microscopic methods. Laser granulometry is considerably less time-consuming than microscopic examination, which allowed for the large sample size used in this study.

The paper is published in Palynology, and is open access, available here.


A Shiny App for Very Simple Particle Size Diagrams

I knocked this app up for students who were struggling to draw diagrams for their reports on particle size. It’s my first app written and published with Shiny, and I’m looking forward to using this interface for more of my code in the future.

How to Use This App:

  • Your data should be saved as a comma-separated values (*.csv) file – NOT an Excel file. Remember to select this option in “Save as…” if you are using Excel.
  • Your data should have sieve sizes in rows, and sample names in columns. The pan should have a size of zero (0). Here’s an example to download and use as a template (there’s also a sketch of this layout after the list).
  • Your sieve sizes should be expressed in microns.
  • Visit https://tombishop.shinyapps.io/histogramR/.
  • If you’ve included multiple samples (in columns), select the column you’d like to plot.
  • If you want to use units of phi, rather than mm, toggle that setting.
  • To copy the image, just right-click it and save it.
  • If you need statistics calculated for your data, you could use Regis Gallon’s excellent G2Sd program – just remember to set the “sep” parameter to “,” and the “dec” parameter to “.”.
  • Remember to label your axes as appropriate.
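For anyone curious about what the input looks like, here’s a minimal R sketch of the expected layout and the standard Krumbein phi conversion (phi = -log2 of the diameter in millimetres). The sieve sizes, sample names and values are made up for illustration.

```r
# Hypothetical layout: sieve sizes (microns) in rows, samples in columns,
# with the pan recorded as size 0 - matching the template described above.
df <- data.frame(size_um  = c(0, 63, 125, 250, 500, 1000),
                 sample_A = c(2.1, 10.4, 24.9, 31.2, 22.6, 8.8))
write.csv(df, "sizes.csv", row.names = FALSE)  # the kind of file the app reads

# Krumbein phi scale: phi = -log2(diameter in mm); the pan (0) maps to Inf
phi <- -log2(df$size_um / 1000)
barplot(df$sample_A, names.arg = round(phi, 1),
        xlab = "Grain size (phi)", ylab = "Mass retained")
```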

Itrax Data Manipulation in R

I’ve been working on ways to make Itrax data more useful to casual users – I figured one way to do this would be to provide some kind of standard report for each scan (or core sequence), with a stratigraphic diagram, some zonation and multivariate analysis. I’ve decided to do this in R, as it is freely available, cross-platform, handles large datasets and has some existing packages that are useful in manipulating scanning XRF data. At present the functionality is very basic (a bit like my understanding of R). I’ve made the following functions available on my Github repository (a rough sketch of the kind of workflow they wrap follows the list):

  • Import: A function for importing Itrax data into R and cleaning it up a bit on the way. Can also plot the data.
  • Ordination: Performs correspondence analysis, with various options for preparing the data. Also provides biplots.
  • Correlation: Generates correlation matrices for Itrax data, and some visualisation.
  • Average: Averages Itrax data into a smaller dataset.
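To give a flavour, here’s a generic sketch of the sort of workflow these functions wrap – not the repository’s actual API. The file name, header offset and element columns are assumptions; check them against your own Itrax result files.

```r
library(MASS)

# Read an Itrax result file (assumed tab-separated with two header lines)
df <- read.table("Results.txt", skip = 2, header = TRUE, sep = "\t")

# Keep a handful of element count columns (assumed names)
elements <- df[, c("Si", "K", "Ca", "Ti", "Fe")]

# Correspondence analysis of the counts, then a biplot of the ordination
ca <- MASS::corresp(elements, nf = 2)
biplot(ca)

# A simple averaged (decimated) version of the data: mean of every 10 rows
avg <- aggregate(elements,
                 by  = list(bin = floor(seq_len(nrow(df)) / 10)),
                 FUN = mean)
```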

I’ll update as I add or modify functionality and documentation. I’m particularly interested to hear from others who are writing code for working with Itrax data, as I think it would make sense to collaborate and work towards a single, powerful suite of tools. Currently my plan is to begin to incorporate some of Menno Bloemsma’s methodology (parts of Itraxelerate) into R, whilst also working on a printable “standard” core data report that can be generated in batches from raw data.

PAST Counter Function

I’ve just discovered the very useful counter function in PAST. PAST is a statistical software package designed specifically for palaeontological data, and can do all sorts of tests and exploratory data processing. I’ve recently moved to version 3.14. One function I’ve just noticed is the counter – this enables you to input counts directly into a spreadsheet using the keys on your keyboard. It also provides auditory feedback and a total count. The software and instructions for its use are available from Øyvind Hammer’s website.