One of the more crucial parts of our job at Pickr as technology journalists is recreating the same experience between test cycles.
As such, Pickr has specific methodologies for various products, as detailed on this page. While an experience is always going to be subjective, it is backed by the same approach each time, to show that reviews and analysis aren't a throw of the dice in the dark, nor just a casual first impression (except where otherwise indicated).
It does need to be noted, however, that the methodology found in this section applies only to reviews published by Pickr. Websites tracked by Pickr’s tracking system are subject to their own methodologies.
Experience
Led by Leigh :) Stark’s 15-plus years of experience reviewing across the Australian technology journalism sector, Pickr ensures its writers have experience in, and an understanding of, the categories they cover.
Any reviewer publishing at Pickr has either reviewed elsewhere or has worked with the team to establish a reliable and trustworthy approach to reviewing.
Phone testing
Phones and smartphones can take anywhere between two and six days to review, but usually closer to three or four.
Typically, a smartphone review is based around the general use-case scenario all phones are made for: day-to-day use.
Because we can’t simulate exactly what a day looks like, Pickr’s test team takes each phone through the course of a normal day.
This means the phone is charged throughout the night and taken off charge at 6am, with the course of the day tested with:
- streaming audio
- web surfing and email browsing
- picture taking, and
- even a few phone calls here and there.
The phone is also set up and connected to no fewer than two push-based email accounts, allowing both work and home email to be accounted for, and testing how much strain the push service puts on the battery.
Connection tests are also performed for the handset, allowing speed tests to not just simulate a strain on the device, but also demonstrate the real-world speeds that are attainable. Unless otherwise noted, all speed is tested across suburbs in Sydney on the Telstra 5G network.
Audio is handled with a pair of wireless headphones or earphones, imitating how most people use their phones today. Over the course of that regular day, the battery is pushed to its limits, showing us what it will do under normal conditions.
A benchmarking application is also used on every phone, making it possible to see how much performance the processor and graphics chip can deliver, especially in comparison to other devices.
Headphones & speaker testing
Audio products such as headphones, earphones, in-ear monitors, speakers, soundbars, and generally anything else that produces sound have to be reviewed based on a combination of technical description and subjective listening.
Listening always has the downside of being subjective, though Pickr’s reviewer likes to think an open program of audio helps make the process a little more transparent.
As such, Pickr’s audio reviews are handled by a sound test that you can listen to yourself using either Apple Music, Google Play Music, or Spotify.
The playlist we use is one stored in lossless audio, much of which is 24-bit, allowing the reviewer to hear the best source of sound that is possible, whether or not you agree with the virtues of 24-bit high-resolution audio.
Soundbar testing
Like headphones and speakers, soundbars can be quite subjective, but we try to keep our reviews trustworthy using a specific test-scene methodology that can be recreated. It has varied over the years, and can include some kids’ cartoons here and there, but with current uses of spatial audio via Dolby Atmos and DTS:X, we test using several minutes of scenes starting at the timecodes below:
- Spider-Man: Across the Spider-Verse (1:03:00)
- John Wick: Chapter 4 (34:00)
- Jurassic World (53:00)
- Mad Max: Fury Road (1:40)
- Whiplash (59:00)
- Bohemian Rhapsody (1:55:00)
- Edge of Tomorrow, aka Live Die Repeat (19:00)
- Rogue One (1:40:00)
We’ll also use two of our sound test playlists to test a soundbar’s ability to play music, covering it with the regular Pickr Sound Test and the Dolby Atmos playlist available on Apple Music.
Laptops & computer testing
Like phones, testing a computer can be very subjective, based entirely on how someone uses it. To deal with this, Pickr reviewers use the computer as their daily driver while benchmarking it.
Benchmarks for computers are typically handled using specific apps, such as Geekbench, while workstation-class machines see more graphical benchmarks applied to demonstrate category-specific performance.
All laptops and computers tested at Pickr are reviewed in use, and unless otherwise noted, the reviews have been written on those computers.
Wearables testing
When it comes to reviewing smartwatches, smart bands, and other health-focused wearables, we put them through their paces, making sure they’re worn and in active use for a period of several days to several weeks, and tested with a variety of features.
Because each wearable differs in feature set, testing may differ per device. However, it always covers the main features of the device, such as triggering an electrocardiogram (ECG), a blood oxygen (SpO2) test, heart rate checks, and sleep tracking, and may also include controlling music from the device.
Coffee machine testing
It’s next to impossible to escape a work week without some sort of caffeine, which makes coffee, tea, and the other ways you get your dose of morning buzz so important. At Pickr, we review coffee machines partly because how a coffee machine functions — namely, whether it delivers a good cup easily and without a lot of effort — is important.
Pickr’s editor Leigh :) Stark has been reviewing coffee machines for over 10 years, and at prior publications spent time learning what type of grounds go into coffee pods, as well as how the machines work.
As a result, he ensures all coffee machine reviews are tested with not just one or two cups of coffee but several, often with several varieties of milk, all to ensure the quality of the review matches or exceeds the quality of the coffee machine and its resulting product.
Awards & recommendations
Awards and recommendations made by the Pickr team are editorial decisions, and no correspondence with companies, brands, or sales and marketing teams impacts them. They are based on experience from using and reviewing the product, and follow the methodologies on this page.
Pickr’s recommendations occur throughout the year for products that have earned a rating higher than 4.25 in our reviews. A “Recommended Choice” badge on a review marks a product recommended by Pickr.
At the end of the year, we give the best products we’ve used and reviewed a “Best Pick” award, with nods handed out across a variety of categories, including phones, headphones, speakers, computers, accessories, and more.
Why you can trust us
Pickr has one main purpose: to inform through research. Some of it is humorous, but none of it is designed to sell. We might have an ad here and there to help with server costs and maintenance of the site, but unless otherwise noted, no story on Pickr contains affiliate links or paid-for promotion.
We experiment with ad placement to keep ad numbers down, because we don’t want you wading through a sea of ad blocks just to read a few paragraphs. That’s not fun or useful for us, and it won’t be for you, either. We know some sites do it, but we don’t want that for our own site. It’s just not a useful way to get information.
Everything at Pickr is built to help you pick through research. That’s the point of Pickr, and the very reason we’re called “Pickr”.