MLA CE: Advanced Searching Techniques and Advanced Strategy Design

Wow, what a rush! It’s already been a month since the MLA/CHLA/ABSC/ICLC Mosaic Conference wrapped up in Toronto and my head is still buzzing.

Besides getting to meet a whole host of incredibly cool and interesting medical librarians (more on that later, if I can muster the blogging muscle), I also had the privilege of attending one of the pre-conference continuing education sessions, “Advanced Searching Techniques and Advanced Strategy Design,” led by Julie Glanville and Carol Lefebvre.

If those names sound familiar to you, it’s because they are involved in all sorts of really interesting work in the systematic searching world. You may have seen their names in the Cochrane Handbook, on the ISSG Search Filters Resource website, or in CADTH’s PRESS Peer Review of Electronic Search Strategies: 2015 Guideline Explanation and Elaboration. It was kind of funny, really: I had recognized their names on the CE materials, and I frequently use ALL of these resources, but I didn’t put it all together until the session itself (needless to say, I was pretty star-struck when all the puzzle pieces came together).

The session was broken up into four main components: search term identification; identifying, choosing, and using search filters; search strategy structure; and peer review of complex search strategies.

Search Term Identification

During this part of the session, we discussed what techniques we used to get a search rolling, and Glanville walked us through a number of really interesting resources, including the Yale MeSH Analyzer, PubReMiner, GoPubMed, MeSH on Demand, and Quetzal.

Text analysis tools for the win!

My key takeaway from this part of the course: using text analysis tools can really enhance the search at the creation stage.

Text analysis tools are fast and can give you an instant breakdown of how frequently keywords and MeSH headings are used in a set of target articles. Previously, I would have started a search by identifying a number of target articles and manually scanning them for commonalities (and, more recently, by using the Yale MeSH Analyzer to identify common MeSH terms). Tools like PubReMiner make this process faster, and they also surface terms and headings that one may not otherwise have picked up on, which will (hopefully) enhance the precision of searches.
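For anyone curious about what these tools are doing under the hood, here is a minimal sketch of that kind of frequency breakdown, written in Python using Biopython’s Entrez and Medline modules. The email address and PMIDs are placeholders, and this is my own illustration of the general idea rather than anything demonstrated in the session.

```python
# Minimal sketch: frequency breakdown of MeSH headings and title/abstract words
# for a set of known relevant articles, in the spirit of PubReMiner or the
# Yale MeSH Analyzer. Assumes Biopython is installed; PMIDs are placeholders.
import re
from collections import Counter

from Bio import Entrez, Medline

Entrez.email = "you@example.org"          # NCBI asks for a contact address
target_pmids = ["12345678", "23456789"]   # replace with your target articles

handle = Entrez.efetch(db="pubmed", id=",".join(target_pmids),
                       rettype="medline", retmode="text")
records = list(Medline.parse(handle))
handle.close()

mesh_counts = Counter()
word_counts = Counter()
for rec in records:
    # MH = MeSH headings; drop subheadings and major-topic asterisks
    for mh in rec.get("MH", []):
        mesh_counts[mh.split("/")[0].lstrip("*")] += 1
    text = f"{rec.get('TI', '')} {rec.get('AB', '')}".lower()
    word_counts.update(re.findall(r"[a-z][a-z-]{3,}", text))

print("Most common MeSH headings:", mesh_counts.most_common(10))
print("Most common title/abstract words:", word_counts.most_common(20))
```

The web tools above do all of this (and much more) without any scripting, of course; the point is simply that the frequency counts they report come from this sort of tallying across a set of target records.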

Search Filters

In this session, Lefebvre described the processes by which search filters are designed, gave a brief background on the ISSG Search Filters Resource, and took the group through methods for appraising search filters.

Search filters: to cite or not to cite?

We got into a really interesting discussion as a group when one of the attendees asked our facilitators whether librarians should be citing the use of search filters when they write up their methods.

Lefebvre was quite adamant that all search filters should be cited, published or not, modified or not. If a filter has been modified, she asserted, it should still be cited, but a description of the modifications should be included as well.

For her part, Glanville felt that modified search filters are essentially new creations, and that it might be misleading to cite a validated filter, since modification would render the original validation invalid.

This was a really important conversation to have as a group, since I will (somewhat abashedly) admit that, though I take painstaking notes on whose searches helped inform my own, it doesn’t always occur to me to cite them when I write up my search methods. Lesson learned!

Search Strategy Structure

What do you do when a search question doesn’t fit into a tidy PICO breakdown? During this part of the day, Glanville walked the group through other search mnemonics such as SPICE (qualitative), ECLIPSE (management), and variations on PICO such as PICOT-D. We also spent some time as a group developing searches that are broken down into concepts but don’t fit into tidy search mnemonics. As Glanville pointed out, this approach can be useful when searching topics in public health, epidemiology, adverse effects, and quality of life.
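To make the concept-based structure concrete, here is a small hypothetical sketch (my own invented example, not one from the session): each concept gets its own block of synonyms combined with OR, and the blocks are then combined with AND, whether or not the concepts map neatly onto PICO.

```python
# Hypothetical concept-based search assembly; the concepts and terms below are
# invented for illustration and would need tailoring for each database.
concepts = {
    "population": ['"older adults"[tiab]', '"aged"[mh]'],
    "exposure": ['"air pollution"[tiab]', '"particulate matter"[mh]'],
    "outcome": ['"quality of life"[mh]', '"well-being"[tiab]'],
}

# OR the synonyms within each concept block, then AND the blocks together.
blocks = ["(" + " OR ".join(terms) + ")" for terms in concepts.values()]
print(" AND ".join(blocks))
```

The keys could just as easily be SPICE or ECLIPSE elements, or looser concept labels that don’t belong to any mnemonic at all; the Boolean logic stays the same.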

Peer Review of Complex Search Strategies

During this final session of the day, Lefebvre discussed search strategy peer review with the group and took us through the Peer Review of Electronic Search Strategies (PRESS) project and its 2015 update. We also spent some time individually assessing a search strategy with errors embedded throughout, then went over the search as a group to discuss various approaches and possible improvements.

Final Thoughts

All in all, this was a very useful course, and I have already begun incorporating text analysis tools into my search strategy formulation (saving time and, in the process, identifying search terms that wouldn’t normally have been on my radar).

I would say that our session was excellent for two main reasons:

  • Julie Glanville and Carol Lefebvre are incredibly knowledgeable and experienced. They are a veritable wealth of information and are both searching powerhouses.

  • Thanks to our small class size and the way in which Glanville and Lefebvre organized the session, we also got to learn quite a bit from our colleagues in small and large group discussions; you wouldn’t believe the collective brain power we had in the room that day!

Many thanks to Glanville and Lefebvre for their enthusiasm and knowledge, to my fellow attendees for sharing their excellent ideas and thoughts with the group, and to my workplace, the Maritime SPOR SUPPORT Unit, for sponsoring my attendance at this session!
