MoniQA 2nd conference on food fraud and allergen management report, by Michael Walker (UK, Government Chemist)

Government Chemist contributes to international conference on food fraud and allergen management, Vienna 7th & 8th June 2018

(by Michael Walker)

Michael Walker represented the Government Chemist at the second MoniQA conference on food fraud and allergen management, in Vienna on 7th and 8th June 2018. Michael attended a MoniQA allergen task group meeting, gave a talk entitled ‘Honey authenticity: when official controls are questioned’, and took part in a panel discussion on food authenticity. He kindly drafted the present report and allowed me to make it public.

The conference attracted 77 delegates representing all aspects of the agrifood sector and was organised by Roland Poms, Secretary General, MoniQA Association, Austria, and his team, along with Richard Cantrill, President, MoniQA Association, Canada. Roland and Richard welcomed the delegates and introduced the conference, an exciting programme of talks mixing cutting-edge science with examples of litigation and enforcement. Strategies and methods for detecting and combating food fraud were presented, as well as discussion of food allergy and the coeliac condition, including precautionary allergen labelling, ‘free from…’ claims, thresholds, laboratory accuracy and class action litigation.

The speakers represented the United Nations’ FAO/IAEA, IFS, LGC, USP and MoniQA, as well as industry (including Nestlé, SQTS and Imprint Analytics), law firms, food research institutions, regulators, academics and non-governmental organizations.

The book of abstracts is published as volume 10, supplement 1, of the journal ‘Quality Assurance and Safety of Crops and Foods’: www.wageningenacademic.com/foodfraud2018.

The keynote speaker was Andrew Cannavan, Laboratory Head, Joint FAO/IAEA Division of Nuclear Techniques in Food and Agriculture, International Atomic Energy Agency, United Nations, Austria. Andrew discussed the global perspective of food fraud and a ‘systems’ approach to dealing with it. His examples were drawn from his Division’s international research projects and capacity building in the developing world. The focus of the projects is mainly on the development and application of analytical methods for food authenticity to underpin traceability and to increase confidence that food commodities reaching local consumers, and those destined for international trade, are safe and authentic. Andrew’s conclusions included that positive criteria for enforcement lead to good analytical methods, and he advocated using the simplest methods that are fit for purpose and validated, alongside advanced confirmatory techniques.

John O’Brien, Deputy Head, Nestlé Research Centre, Switzerland, gave some key insights on food authenticity from an industry perspective. He highlighted challenges such as ‘free from’, ‘natural’, alternative proteins, new technologies, lab-grown meat, ‘grass-fed’ and packaging reduction. All stakeholders need to recognise that many of these are pre-competitive issues. Systemic vulnerability analysis, targeted audits and early warning tools are now in use in both the private and public sectors to detect and to manage food fraud risks. Some facilities are employing sophisticated analytical tools to distinguish between ‘normal’ and ‘abnormal’ fingerprints, enabling follow-up action on a targeted basis. Such approaches have been shown to assist greatly in the early management of issues, affording greater consumer protection. However, there remain gaps in global coverage and in consistency of use of such tools.

Building on this, Thomas Gude, Deputy Head, SQTS, Switzerland, discussed the value of food profiling to assure food authenticity rather than to investigate food fraud. Food profiling is a systematic approach applying targeted and non-targeted analysis with modern high-resolution mass spectrometry. This is a promising area that needs much more work and international collaboration. Thomas illustrated his concepts with food contact materials, where analysis covers both known targets (additives) and non-targets, i.e. NIAS (non-intentionally added substances).

Steven Gendel, Senior Science Manager – Food, United States Pharmacopeia (USP), speaking on assuring food authenticity from a standards developer’s perspective, noted that assuring food authenticity is a complex problem that includes the need for information integrity and communication. Consumers expect foods to contain all of the constituents that should be present, to contain nothing inappropriate or harmful, and to have accurate and complete labels. These same expectations apply to all the participants in the supply chain. For industry, authenticity starts with the ingredient. Thus ingredient standards and reference materials are needed, along with identity procedures, acceptance criteria and technically rigorous methods. Steven described with examples how these requirements are met by the Food Chemicals Codex. His final message was that future standards will need to address complex ingredients whose composition can vary depending on environmental, agricultural and other unpredictable factors. This will create a need to share information, such as sets of spectra, that is not amenable to publication as traditional documents. These changes mean that standards development organizations, and the food industry, will need to develop a new understanding of what constitutes an ingredient standard, of how to ensure data integrity, and of how to communicate and use these new standards along the supply chain.

Michael Walker, Laboratory of the Government Chemist, UK, discussed honey authenticity, in particular when official controls are questioned. Michael described a series of food fraud cases involving honey in the U.S., China, New Zealand and Denmark. There are many means of adulterating honey, including the addition of cheap sugars and syrups after collection from hives, overfeeding bees with saccharides or invert derivatives, and the falsification of the floral or geographical origin. In the face of media reports that more mānuka honey is sold worldwide than is produced in New Zealand, a set of high-level characteristics for mānuka-type honey was developed, which Michael reviewed and critiqued. He described the European Commission control plan on honey, in which over 2,000 samples of honey were collected, some 20% of which were non-compliant with authenticity criteria. Honey authentication requires a multifaceted approach, which can be costly and time consuming. As well as classical analysis, δ13C EA/LC-IRMS is required. But is this sufficient? Nuclear magnetic resonance (NMR) can provide quantitative data and molecular structural information on key components in honey with little sample preparation, and over the last five years a small but significant literature has emerged on this approach. However, in 2015 the UK Food Standards Agency wrote to UK enforcement authorities to state ‘… 1H NMR … screening method gives indicative results and does not definitively prove that added sugar is present … no enforcement action should be taken in relation to the NMR results alone with regards to added sugar at the present time …’. The reasons for this were examined, along with the outcomes of the 2018 European Commission Joint Research Centre Technical Round Table on Honey Authentication.

Cesare Varallo, Vice President – Business and Regulatory Affairs EU, INSCATECH, Italy, gave a lawyer’s perspective on how technologies such as blockchain could be used to mitigate the risk of lawsuits and recalls. His presentation examined some of the most recent developments and applications of artificial intelligence, the internet of things and blockchain in the food supply chain. These technologies must be properly understood before being applied: benefits and vulnerabilities should be carefully considered. For example, systems’ interoperability cannot be ignored; nonetheless such technologies can contribute to strengthening the supply chain and the quality of the data, especially in the face of litigation.

Litigation was again the theme taken up by Riëtte van Laack, Director, Hyman, Phelps & McNamara, P.C., USA. With a PhD in food science and law qualifications, Riëtte was well placed to discuss the U.S. response to food fraud: FSMA, litigation and the national organic program. Although the U.S. Food and Drug Administration’s regulations implementing the Food Safety Modernization Act specifically mention food fraud, they address food fraud only to the extent that the fraud is a safety concern. Compositional standards help prevent and combat fraud to the extent that such standards can be verified. However, frequently, standards cannot be verified by testing alone. In the United States, private litigation by competitors and consumers is used frequently to combat alleged fraud. Competitors know the market and more easily recognize circumstances of possible fraud. Consumer protection organizations can highlight potential fraud situations. In addition, many state laws provide for monetary recovery in consumer class actions for food fraud. Examples of food fraud litigation include lawsuits regarding pomegranate products and cases regarding extra virgin olive oil. In some cases, private litigation has resulted in the development of standards, either by independent third parties or by federal agencies.

Riëtte went on in a second talk to discuss the failure of analytical tests. Fraud is often committed with knowledge of what companies generally test for. For analytical testing to be useful in detecting fraud, at least three requirements must be met: (1) a clearly defined standard related to the chemical composition of the authentic product has been established; (2) there is a known compositional difference between the authentic and fraudulent food; and (3) the analytical test must be validated. Riëtte illustrated the failure of testing with the litigation by the New York Attorney General against retailers accused of selling herbal dietary supplements allegedly not containing what they were represented to contain. DNA testing of these supplements allegedly revealed the absence of the labelled botanical substance. However, the investigators had failed to consider that the extraction processes applied to botanicals could have removed or destroyed genetic material.

Day 1 of the conference concluded with a panel discussion on interpreting legal limits in the context of laboratory data. Moderated by Richard Cantrill, the panel of Cesare Varallo, John O’Brien, Michael Walker, Riëtte van Laack, Thomas Gude and Steven Gendel discussed the value of data in litigation and disputes, with lively input from the audience.

Day two of the conference began with Beatriz Torres Carrió, Senior Quality Assurance Manager, International Featured Standards (IFS), Germany, who gave an informative talk on IFS strategies of quality management against food fraud and the new IFS certification. Much of the IFS documentation is freely available on their website.

David Psomiadis, Head of Laboratory, Imprint Analytics, Austria, then discussed forensic science and digital techniques in food authenticity and traceability testing. He illustrated analytical tools, good practice and future trends with several interesting examples. These included isotope analysis (LC/EA-IRMS) of ‘coconut water’ for added sugar, detection of synthetic vanillin, and fruit juice origin.

Steven Gendel returned to the podium to talk on opportunities and challenges in targeted vs non-targeted methods. Unlike microbial hazards, chemical hazards in foods cannot usually be controlled by processing. This means that control of chemical hazards in a prevention-based food safety system is focused on the supply chain. Dilution is a poor solution and may be illegal. The food industry is turning to the use of non-targeted methods to characterize foods or ingredients. The advantage of well-designed non-targeted methods is that they can indicate whether a particular sample is ‘out of range’ without needing to know why. The disadvantage of these methods is that they require a great deal of data to determine the expected ‘range’ under a variety of growing, harvesting, and handling practices. In addition, each sample test can generate significantly more data than presence-absence or threshold-based tests. The volumes of data involved and the need for standards on how to generate and use these data present unique challenges and opportunities for standard development organizations and for food manufacturers. Steven then led a discussion on the acceptability of untargeted analysis and the acceptance of the results by industry, regulators and standard setters.
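
To make the ‘out of range’ idea concrete, here is a minimal sketch of one common way such a non-targeted check can be built, assuming a bank of fingerprints (e.g. binned spectra) from authentic samples. The PCA-based distance score and all data below are invented for illustration; this is not a method any speaker presented.

```python
# Illustrative sketch: flag "out of range" samples against a bank of
# authentic fingerprints using a distance score in PCA space.
# Each row of X is one fingerprint (e.g. binned NMR or MS intensities).
import numpy as np

def fit_authentic_model(X, n_components=5):
    """Fit a PCA model to fingerprints of known-authentic samples."""
    mean = X.mean(axis=0)
    _, s, Vt = np.linalg.svd(X - mean, full_matrices=False)
    comps = Vt[:n_components]                      # principal directions
    var = (s[:n_components] ** 2) / (len(X) - 1)   # variance per component
    return mean, comps, var

def t2_score(x, mean, comps, var):
    """Hotelling T2-style distance of one sample from the authentic cloud."""
    scores = comps @ (x - mean)
    return float(np.sum(scores ** 2 / var))

# Usage: any sample scoring far beyond the range seen in training is
# suspicious, without needing to know which adulterant caused the shift.
rng = np.random.default_rng(0)
authentic = rng.normal(0.0, 1.0, size=(200, 50))   # stand-in fingerprints
mean, comps, var = fit_authentic_model(authentic)
baseline = [t2_score(x, mean, comps, var) for x in authentic]
threshold = np.percentile(baseline, 99)            # empirical cut-off
suspect = rng.normal(0.8, 1.0, size=50)            # shifted "abnormal" sample
print(t2_score(suspect, mean, comps, var) > threshold)
```

Note how the threshold is set only from the spread of authentic samples, which is exactly why such methods need a great deal of training data across growing, harvesting and handling conditions.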

Roland Poms, Secretary General, MoniQA Association, Austria, then described method validation and reference materials to assure the reliability of analytical results. There is always a need for laboratories to be able to demonstrate that their methods are fit for purpose in their hands, give equivalent results to the reference method and can be viewed with confidence by the customer. The requirements for the quality of an analytical method are best assessed in a validation study, which usually involves some 8-16 laboratories so as to obtain at least 8 valid results for statistical analysis. Parameters assessed include limit of detection, limit of quantitation, repeatability and reproducibility (variability of results within and between laboratories), accuracy, specificity, and false positives or negatives. Additional information that can be drawn from a validation study concerns robustness, the acceptability and handling of the method in the hands of different operators, and the influences on results that can be identified in a routine setting. Necessary steps towards assuring the reliability of analytical results in any laboratory are the preferred use of validated methods, the use of reference materials if available, method verification and participation in proficiency tests, training, considering the requirements for laboratory accreditation and following Good Laboratory Practice. Reliable analytical results are the basis for appropriate decision-making processes concerning product safety and adequate food safety management measures.
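
As a hedged illustration of how two of these parameters are typically derived, the sketch below computes repeatability (s_r) and reproducibility (s_R) from a balanced collaborative study via one-way ANOVA by laboratory, in the spirit of ISO 5725; the data and laboratory count are invented.

```python
# Hypothetical sketch: estimating repeatability (s_r) and reproducibility
# (s_R) from a balanced collaborative study, one-way ANOVA by laboratory.
import numpy as np

def precision_estimates(results):
    """results: one list of replicate measurements per laboratory,
    with the same number of replicates n in every laboratory."""
    labs = [np.asarray(r, dtype=float) for r in results]
    p = len(labs)                      # number of laboratories
    n = len(labs[0])                   # replicates per laboratory
    grand = np.mean([x.mean() for x in labs])
    msw = np.mean([x.var(ddof=1) for x in labs])                    # within-lab mean square
    msb = n * sum((x.mean() - grand) ** 2 for x in labs) / (p - 1)  # between-lab mean square
    s_r2 = msw                                   # repeatability variance
    s_L2 = max((msb - msw) / n, 0.0)             # between-laboratory variance
    return np.sqrt(s_r2), np.sqrt(s_r2 + s_L2)   # s_r, s_R

# Example: 8 laboratories x 2 replicates, the scale Roland mentioned.
data = [[4.9, 5.1], [5.3, 5.2], [4.7, 4.8], [5.0, 5.1],
        [5.4, 5.6], [4.6, 4.7], [5.2, 5.0], [4.9, 5.0]]
s_r, s_R = precision_estimates(data)
print(f"repeatability s_r = {s_r:.3f}, reproducibility s_R = {s_R:.3f}")
```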

Samuel Godefroy, Professor of Food Risk Analysis and Regulatory Systems, Food Security, Laval University, Canada, described the Association of Official Analytical Communities (AOAC INTERNATIONAL) standard method performance requirements (SMPRs) for allergen detection methods. Samuel described how analytical methods are part of the Codex regulatory provision – they don’t exist in isolation but support a Codex standard. SMPRs for food allergen methods using ELISA-based techniques describe the minimum recommended performance characteristics to be used during the evaluation of a method. SMPRs for egg and milk are available; those for tree nuts will rely upon the ability of the food allergen community to develop agreed-upon reference materials for the selected tree nut(s) of priority.

Karin Hoffmann-Sommergruber, Group Leader, Department of Pathophysiology and Allergy Research, Medical University of Vienna, Austria, spoke on precautionary allergen labelling. Karin described how the majority of allergic reactions involve skin, respiratory, gastrointestinal and systemic (anaphylaxis) effects. Legislative labelling and Precautionary Allergen Labelling (PAL, which is voluntary) were described, and the disadvantages of the latter explored. Probabilistic allergen risk management was discussed, along with thresholds of elicitation. Unfortunately, allergic consumers often assume PAL is regulated. Karin described how PAL is approached by various countries and recommended that PAL should only be used if cross-contamination is unavoidable and represents a real risk.

Clare Mills, Professor of Biological Sciences, Division of Infection, Immunity and Respiratory Medicine, University of Manchester, UK, delivered her talk, ‘Free-from foods – what does it mean for allergens?’, via Skype, which worked well. There is currently no consensus as to what constitutes a ‘free-from’ food with regard to IgE-mediated food allergies. Regulators and food manufacturers alike have to rely on analytical testing to demonstrate the absence of an allergen in a food product. The lack of agreed reference doses that are considered safe for the majority of allergic consumers means it is unclear how sensitive test methods need to be, although dose distribution modelling can provide guidance on the levels of allergens that are unlikely to cause a reaction. Inter-laboratory comparisons of immunoassay test methods for foods such as milk, egg and peanut have shown wide variations in test method performance regarding sensitivity and reproducibility of results. This leaves the possibility that manufacturers and enforcement bodies may obtain conflicting test results. The development of appropriate certified reference materials for allergen analysis can help reconcile some of these differences, especially for test methods showing reproducible and consistent differences. Mass spectrometry (MS) methods have much to offer as a complementary, confirmatory method to the currently favoured immunoassay test methods. Clare described the iFAAM multi-centre study using an MS method for determination of peanut in chocolate dessert. The trial showed significant divergence in the ability of ELISA tests to quantify peanut allergens and demonstrated that MS has the potential to detect and quantify peanut protein at similar levels to ELISA. Identified gaps will be taken forward in the recently EFSA-funded project ThRAll.

Katharina Scherf, Research Group Leader, Leibniz-LSB, Technical University of Munich, Germany, followed with a well-thought-out presentation on improved reference materials for gluten analysis. Katharina described the standards for gluten-free products and the difficulties of gluten analysis. Well-characterized reference materials are essential to help address these challenges, and Katharina summarised a considerable amount of work carried out by an international consortium of which she is a leading participant. This has resulted in the identification of wheat cultivars that are representative of the multitude of wheats grown worldwide. Selection criteria for representative wheat cultivars as the basis for the development of a new reference material for gluten(-free) analysis were defined. Grains of wheat cultivars from different geographical origins were collected, milled into white flours and characterized for chemical composition, wet and dry gluten content, ELISA response using two different antibodies, and protein composition assessed by gel-permeation and reversed-phase high performance liquid chromatography and polyacrylamide gel electrophoresis. Based on the results, qualitative and quantitative selection criteria were defined and five wheat cultivars from four continents were selected. These cultivars were further investigated, and two reference materials were found suitable: the single cultivar Carberry and the blend of the five cultivars.

René Crevel, Director at René Crevel Consulting Limited, UK, formerly of Unilever, gave a cogent overview of new tools for allergen risk assessment and management arising from the iFAAM project (Integrated Approaches to Food Allergy and Allergen risk Management). The project team developed an allergen tracking tool, together with a tiered risk assessment approach. The tools and their application were thoroughly described. Tier 1 is based on point estimates and is designed to be used by those without deep expertise in the allergen field, although they do need sufficient knowledge of their own processes. The risk assessment is conservative and hence unlikely to produce an unwise assessment. Tier 2 is built on previous models and uses distributions of minimum eliciting doses, food consumption data and unintended allergen concentrations, combined with advanced statistics and modelling. Tier 1 will be freely available. Tier 2 will not be publicly available, in part owing to the level of expertise required to use it. René also described the iFAAM risk management toolbox, an extension to the tracking tool: risk management options were collected and collated and a decision tree approach developed. iFAAM finished in 2017 and the tools will be available on the website. Currently iFAAM is looking at a governance structure for the website dissemination of the tools to ensure continuing relevance. The information can also be accessed via the MoniQA website.
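
The flavour of a Tier 2 calculation can be suggested by a simple Monte Carlo sketch like the one below. Every distribution and parameter here is a hypothetical placeholder, not iFAAM’s actual model or data; the point is only how distributions of eliciting doses, consumption and contamination combine into a predicted reaction rate.

```python
# Illustrative Monte Carlo sketch of a Tier-2-style probabilistic allergen
# risk assessment. All distributions and values below are invented.
import numpy as np

rng = np.random.default_rng(42)
N = 100_000  # simulated eating occasions

# Assumed inputs (hypothetical values):
consumption_g = rng.lognormal(mean=np.log(50), sigma=0.5, size=N)      # portion size, g
contamination_mgkg = rng.lognormal(mean=np.log(2), sigma=1.0, size=N)  # mg allergen protein/kg
eliciting_dose_mg = rng.lognormal(mean=np.log(10), sigma=1.2, size=N)  # individual eliciting dose, mg

# Dose of allergen protein per eating occasion: g * (mg/kg) / 1000 = mg
dose_mg = consumption_g * contamination_mgkg / 1000.0

# Predicted fraction of eating occasions where an allergic consumer's
# individual eliciting dose is exceeded:
risk = np.mean(dose_mg > eliciting_dose_mg)
print(f"predicted reactions per 10,000 allergic eating occasions: {risk * 1e4:.1f}")
```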

Bert Popping, Managing Director and Co-owner, FOCOS, Germany, then discussed the benefits and challenges of consumer analytical devices. Against a backdrop of growing distrust of food in some quarters, can point-of-use devices help, especially people with food allergies? Available and affordable in the U.S., they have given rise to a number of issues, including sampling, misinterpretation of instructions and outputs, and specificity. Bert described the 2018 AOAC stakeholders’ guidance document for consumer analytical devices with a focus on gluten and food allergens, published in Journal of AOAC International, 101(1), pp. 185-189.

Ronald Niemeijer, Director Global Marketing Food & Feed Analysis, R-Biopharm, Germany, next discussed socio-economic aspects of food allergens. Ronald reviewed the burdens on people with allergy and autoimmune conditions, and their prevalence, and discussed the provision of ‘free-from’ food on their behalf. ‘Free-from’ is a growing business opportunity but means increased production costs, logistic separation of ingredients, dedicated facilities or lines, or much stricter cleaning; management and testing costs are also higher. Ronald concluded with a review of current and potential future tools in allergen management and food allergen testing.

Adrian Rogers, Senior Research Scientist, Romer Labs UK, UK, discussed allergen analysis through three case studies demonstrating how the particular challenges faced by different users were overcome to make sure that immunoassay-based testing for food allergens best suited their needs. These were (1) an intensive validation programme of an ELISA for peanut in confectionery after the kit the customer had previously been using was withdrawn, (2) a study of casein determination in hydrolysed whey-based infant formula and (3) findings of almond and peanut in a spice/seasoning sample where different kits gave different results.

Two awards for best poster at the conference were made, with the researchers giving short talks on their work. Patricia Galan-Malo presented ‘Survey on the occurrence of allergens on food-contact surfaces from school canteen kitchens’ and Andreas Zitek presented ‘Determination of origin and authenticity of fish by elemental and Sr-isotopic fingerprints’.

Finally, Richard Cantrill and Roland Poms gave feedback on the two MoniQA Task Force workshops, on (a) Food Fraud Prevention and Authenticity and (b) Food Allergen Reference Materials. The next task force meetings will be in Toronto at an AOAC allergen meeting in August 2019. The next MoniQA conference will occur in the latter half of 2019.

 

MoniQA Conference Report: Food Fraud Prevention and Effective Food Allergen Management (Bari, 26 – 27 Jan 2017)

Notes of meeting by Michael Walker, LGC, UK (see bio at the end of the article).

This was a two-part meeting, with day one on food fraud and day two on food allergen management.

Food Fraud

In the opening lecture of the conference, Michael Walker, in a talk ‘Food detectives: what it takes to trace food fraud’, described the historical cycle of meat species fraud and, as an expert member of Professor Chris Elliott’s UK review, the spectrum from ‘cutting corners’ to food crime. Michael summarised the Elliott Review findings – the ‘8 pillars of food integrity’ – dealing particularly with the National Food Crime Unit (NFCU), the Virtual Authenticity Network and intelligence sharing. Michael concluded with case studies based on referee casework, including advice on the curation of datasets, intelligence sharing and adequate levels of analytical surveillance for food fraud.

Cesare Varallo, a food lawyer and regulatory specialist, gave a persuasive talk on how food companies should handle a food recall – starting with planning for what you hope will never happen. Decide (in advance) who will do what if a recall needs to happen, and have an effective communication strategy. Consumer safety, avoidance of panic and brand protection can all be achieved; Cesare noted especially that social media should not be neglected.

Jeff Moore, Director, Science – Food Standards, United States Pharmacopeia (USP), explained the background to USP’s involvement in food. Jeff explained USP’s current guidance (free to download and comment on) on validation of non-targeted methods for food fraud detection. Non-targeted methods work not by looking for what should not be there but by defining carefully what should be there. Any deviation in the expected analytical signal flags up suspicions that can be further investigated. This is a powerful approach for checking for the ‘unknown unknowns’, but so far it has been difficult to validate to accepted chemistry standards, so this USP advice is very welcome. The USP food fraud mitigation strategy and food fraud database are key resources. It’s impossible for every food firm to investigate every ingredient – USP guidance shows how to funnel down risky ingredients so as to prioritise vulnerability assessments. The Food Fraud Database can give real-time information on designated ingredients and records types of food fraud, incidence records, inference records, surveillance records, and methods.

Jingyi Li Blank, Director, Mintz Group, Hong Kong, gave a fascinating glimpse into the highly restricted world of private investigations in food. Her examples included background checks on people, information related to acquisitions, disputes, IP violation, post-fraud internal investigations, and corruption and bribery investigations. All these are handled in a highly restricted and confidential manner. The information is derived by wholly legal means from public records (which can be broad, e.g. personal attendance at court hearings) and human intelligence inquiries. For example, in China, understanding the true ownership and management structure, the local government dynamics, or whether the company or its principals have ever been subject to corruption or litigation issues has been key to helping clients understand the profile of an important partner.

Alain Maquet, of the Joint Research Centre (JRC), European Commission, gave an overview of targeted single/multiple marker approaches (IRMS) and fingerprint-type methods (-omics and IR). Alain showed that organic wheat can be differentiated from conventionally grown wheat by transcriptomics, although more samples will be needed to enhance the predictive ability of the technique. Untargeted metabolomics was successful in classifying organic versus conventionally grown carrots.

Marco Arlorio, Università del Piemonte Orientale “A. Avogadro”, Italy, made the point, in a talk on post-analysis data processing of untargeted methods, that untargeted means big data – so what post-analytical steps need to be taken? His solutions included data fusion by mathematical algorithms to better describe the phenomenon, along with ‘deep analytics’ and artificial intelligence – artificial neural networks, although these must be trained on large numbers of samples. Once again the problem of a lack of reference standards raised its head.

Bert Popping, formerly of Eurofins and now teamed with Carmen Diaz-Amigo in their own consultancy firm, compared the food frauds uncovered by Frederick Accum (‘Death in the pot’) 200 years ago with modern frauds, many in the same product categories, such as olive oil. He emphasised considerations of mass balance in detecting fraud and looked at the information to be derived from consumer product buying patterns and, without using the term, discussed ‘the consumer as analyst’, affirming the findings of our own horizon scanning in this regard.

Richard Cantrill, Chief Science Officer, American Oil Chemists’ Society (AOCS), gave a good generic answer to the question ‘What is your favourite olive oil?’ – ‘one I can taste before I buy’. If you can’t rely on that, exacting olive oil specifications have been developed, together with elaborate analytical approaches. The International Olive Council has devised approaches to verify authenticity by relying on multiple approaches to oil chemistry designed to limit the possibilities of adulteration with other common seed oils.

Professor Tullia Gallina Toschi, University of Bologna, Coordinator of the OLEUM Project, described its aims to better guarantee olive oil quality and authenticity by empowering the detection and fostering the prevention of olive oil fraud. Over four years, the project will develop, with the widest range of stakeholders, new methods and carefully revise existing analytical methods for the detection of fraud in different categories of olive oil. It will establish a wide community of laboratories and institutions involved in quality control (the OLEUM Network) and will store and share relevant data and results (the OLEUM Databank). OLEUM has identified gaps – in legislation and regulation, analytical approaches, harmonisation and coordination, and consumer and market confidence – and wants to address these and enhance the competitiveness of the European olive oil sector. But there are many global control labs using all the existing methods, so it is best to adapt and improve what we have to make existing methods more effective and efficient. OLEUM is still in its first few months.

Michal Godula, Thermo Fisher, Italy, described resolving food authenticity challenges using advanced isotope ratio analysis with Orbitrap high-resolution mass spectrometry.

David Psomiadis, Imprint Analytics, Austria, described developments in food authenticity testing by stable isotope analysis, a service provided by Imprint Analytics.

Valeria Terzi, Genomics Research Centre, CREA-GPG, Fiorenzuola d’Adda, Italy, discussed DNA markers, digital PCR and NGS to secure authenticity (e.g. Durum wheat) and safety (e.g. mycotoxins) in the small grain cereals sector.

Kezban Candogan, Department of Food Engineering, Ankara University, Turkey, spoke on the species identification of raw meat mixtures using Fourier transform infrared spectroscopy.

Food Allergen Management

Professor Clare Mills, University of Manchester, Coordinator of the iFAAM Project, UK, opened the second day of the MoniQA International Symposium in Bari. In Clare’s wide-ranging review of the development of risk assessment in food allergy, the highlight for me was the new data generated in iFAAM broadening the evidence base to support threshold doses and evidence-based action levels for allergens in foods, which has the potential to yield a quantitative risk assessment. Secondly, Clare re-emphasised the need for better analytical methods and reference materials for food allergens.

Professor Samuel Godefroy (Laval University, Canada, and Queen’s University Belfast) spoke on risk assessment and the regulatory perspective, echoing calls for better analysis. Prof. Godefroy welcomed the application of deterministic risk assessments for incident evaluation, but also the moves within iFAAM towards a more probabilistic risk assessment, which will be of wider application in the better management (and reduction) of precautionary allergen labelling. But he cautioned that harmonised international guidance is needed to create a level playing field for regulators and a more predictable environment for industry.

Professor Luigi Macchia, University of Bari, gave a fascinating talk on the local clinical food allergy prevalence in the Puglia region.

Prof Clare Mills presented data from Dr Audrey Dunn Galvin of Cork on the iFAAM labelling survey to help us to understand more about how those living with food allergy assess risk when making decisions based on precautionary allergen (“may contain”) labelling. These data will be key in creating conditions in which food labelling is truly useful for, and more importantly, used by consumers with food allergies.

Luca Bucchini, Hylobates, Italy, gave a detailed analysis of the iFAAM database of food allergen recalls, calling for more harmonisation of how regulators deal with and report recalls.

Robin Sherlock, DTS FACTA, Australia, walked us through the Allergen Bureau’s VITAL 2.0, showing how industry benefits from food allergen management support tools. Robin, who has been an Allergen Bureau Board member from the start of this excellent programme, noted that analysis plays only a small part in the spectrum of Allergen Bureau allergen management and risk assessment tools. But the benefits of good analysis include monitoring incoming ingredients, investigating, validating and verifying cleaning, and problem solving and troubleshooting, especially with any problematic equipment on the manufacturing line.

Daniel Imhof, Head, Official Food Control Authority, Brunnen, Switzerland, explained how Switzerland set a value of 1 g/kg (0.1%) which triggers allergen labelling, even if the allergen is not intentionally added but has inadvertently entered the finished product. The legislation has not set a limit, but rather a ‘mark’ by which the food industry should be aware of unintentional contamination in the context of its own product control and allergen management. In 2006 Allergie Suisse SA also defined a maximum value (50 mg/kg) for allergens as an analytical detection limit, and assesses and publishes complying products. Of the approximately 400 certified products so far, this value has not been exceeded. Hence the 1 g/kg (0.1%), set up 20 years ago, is in effect superseded, since the food industry seems to be operating at in-house limits much below this and below the 50 mg/kg Allergie Suisse limit.

My own talk, immediately after lunch, seemed to go down well, and I hope my recommendations on how to report allergen results are well received and put into practice. The take-away message was:

Reporting the results of allergen analysis:

A report should state:

• Method of analysis

• Result: [X] mg/kg as Y, where [X] is the best estimate of the concentration of allergen found by analysis of the sample received, after in-laboratory homogenisation, extraction and analysis by a validated method, and Y is EITHER the allergen protein OR the name of the food.

• But if the whole food is the reporting basis, the conversion factor from allergen protein to whole food must be given (a minimal worked sketch follows this list).

• Conversion factors should be agreed, with literature references, for the typical protein contents of (at least) Annex II allergens. Adding the nitrogen-to-protein factor would be useful.

• As a matter of routine, the basis of the data, as allergen or (preferably) allergen protein, should be specified every time a datum is given in a method or report … until it appears ad nauseam (Walker M & Rogers A, Romer Academy 2016).
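
As promised above, a minimal worked sketch of the protein-to-whole-food conversion. The 36% protein fraction for skimmed milk powder is an illustrative figure only; real factors should be agreed with literature references, as the list recommends.

```python
# Minimal sketch of the protein-to-whole-food conversion the report
# format calls for. The protein fraction is a hypothetical placeholder.
def as_whole_food(result_protein_mgkg: float, protein_fraction: float) -> float:
    """Convert a result expressed as allergen protein (mg/kg) to an
    equivalent expressed as the whole food (mg/kg)."""
    return result_protein_mgkg / protein_fraction

# Example: 2.0 mg/kg as milk protein, assuming skimmed milk powder is
# ~36% protein (illustrative figure only) -> ~5.6 mg/kg as the whole food.
print(f"{as_whole_food(2.0, 0.36):.1f} mg/kg as skimmed milk powder")
```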

I also put in a slide asking for expressions of interest in a Marie Curie fellowship at LGC on allergen protein measurement by exact-matching isotope dilution mass spectrometry (IDMS).

Prof. Melanie Downs, Food Allergy Research & Resource Program (FARRP), University of Nebraska-Lincoln, walked us through the FARRP development and validation of effective food allergen control plans.

Dr. Linda Monaci, ISPA-CNR, Italy, gave a comprehensive and technically thorough review of the current literature and advances made in methods for food allergen analysis. Linda especially focused on tandem MS methods capable of detecting multiple allergens at the same time within a single run.

Ronald Niemeijer, R-Biopharm, Germany reviewed rapid methods in food allergen management.

Dr. Roland Poms, Secretary General/CEO, MoniQA, discussed food allergen reference materials and their impact on more reliable analytical results. MoniQA has liaised with the EU-funded project iFAAM, the Prolamin Working Group and Australia’s VITAL concept group. The first validated reference materials for food allergen analysis were presented at the latest AOAC Annual Meeting 2016 in Dallas, Texas, USA. The first set of materials includes testing materials for milk allergen analysis comprising a Positive Control (SMP-MQA 092014, characterized dried skim milk powder), a Negative Control (BLANK-MQA 082015, based on a gluten-free cookie), and two Incurred Materials: LOW-MQA 092016 (SMP incurred in gluten-free cookies, milled, concentration approx. 5 ppm) and HIGH-MQA 082016 (SMP incurred in gluten-free cookies, milled, concentration approx. 50 ppm). A new gluten material is in preparation, based on an extensively characterized wheat flour, which will enable protein, peptide, DNA and potentially other markers to be analysed. The reference material for gluten(-free) analysis will be available as incurred materials at concentrations of 0, 5, 10 and 100 mg/kg gluten in a baked product. Other food allergen reference materials (egg and soy) are in preparation.

Dr Katharina Scherf, German Research Centre for Food Chemistry, Leibniz Institute, Germany, gave an overview of methods for gluten analysis, including the benefits of current methods and remaining challenges for this very complex analyte. There is as yet no certified reference material for gluten, although there is the Prolamin Working Group material. Differing ELISA platforms can yield differing results, probably owing to variable prolamin/glutelin ratios. A GP-LC-FLD method was developed that appeared to perform well, and a stable isotope dilution method was used to quantify the immunodominant 33-mer gluten peptide.

The talk by Dr Maria De Angelis, University of Bari, Italy, on meeting the needs of celiac and gluten-sensitive consumers was delivered by Dr Fabio Minervini. Sourdough fermentation, a traditional biotechnology for making leavened baked goods, has recently been rediscovered by the scientific community, consumers and producers. In sourdough bread, acidification, proteolysis, activation of endogenous enzymes and fermentation take place. Based on this, De Angelis’ research group has used fungal proteases and selected lactic acid bacteria in a strategy whereby spray-dried wheat subjected to sourdough fermentation could meet the needs of celiac and gluten-sensitive consumers. It could also improve the intestinal microbiota and host immunity, increasing resilience to coeliac disease (CD).

Lastly Kathryn Miller, Coeliac UK, talked us through the crossed grain symbol and the certification of gluten free products.

By Michael Walker, Referee Analyst in the Laboratory of the Government Chemist, LGC, United Kingdom

A published scientist, Michael is also comfortable in strategy and policy. His current portfolio includes several strands:

o Referee Analyst in the Laboratory of the Government Chemist, LGC, where he also manages research on allergen measurement,

o Chair of the FSA NI Strategic Committee on Food Surveillance,

o A thriving chemico-legal practice,

o Member of the IFST Science Committee

o Training officer for the Association of Public Analysts.

Michael was a subject matter expert to the UK DH/Defra Elliott Review in the aftermath of the horse meat scandal and continues to advise government on aspects of the Review. Michael formed his own consultancy in 2006 following extensive experience as a partner in a private laboratory practice, Public Analyst, non-executive director and in Forensic Science Northern Ireland. Michael is a member of the European Academy of Allergy & Clinical Immunology and facilitated a Food Allergy & Food Intolerance Knowledge Network for 5 years to 2015. His non-executive director experience includes serving as a founder board member of the FSA. He is a Chartered Chemist and a Fellow of both IFST and RSC, and holds the MChemA, the statutory qualification to act as a Public Analyst in the UK.