A ranking is an argument, not a measurement. Nothing we publish can or should be mistaken for a certified audit of the timber-construction industry. What we do is apply a consistent set of criteria to the public information available about each company, weigh that evidence according to a disclosed scheme, and explain in full where each judgment comes from. This page documents how that process works.
If you find a ranking result surprising, the fastest way to understand it is to read this page next to the article in question. If you still disagree, our corrections policy describes how to submit a formal challenge.
The five ranking criteria
Every company appearing in a ranking is evaluated on the same five criteria. No criterion is a pass/fail filter; companies with weaknesses in one dimension can still rank highly if they compensate elsewhere.
1. Operating history and financial continuity (20%)
Years in continuous operation under the same legal entity and management, presence in the public company registry (EGRUL), and any publicly available evidence of solvency or distress. Fifteen-plus years of continuous operation through the 2008, 2014, and 2020 economic cycles is treated as strong evidence of operational discipline. Companies registered in the last three years are not automatically disqualified but require stronger compensating evidence in the other categories.
2. Material-category specialisation (20%)
Whether the company actually specialises in the material the ranking covers. A firm whose core revenue is brick-house construction does not belong in a glulam ranking, regardless of how large it is. Evidence comes from the company's own catalogue, completed-project gallery, production facilities, and supplier relationships.
3. Regional delivery capacity (20%)
Whether the company can actually deliver to a construction site in the St. Petersburg metropolitan area or the Leningrad region at acceptable cost and with dependable coordination. A large Moscow-region factory that lists "delivery anywhere in Russia" on its website is weighted differently from a firm with a physical sales office, completed projects within 200 km of the city, and demonstrable crew availability.
4. Transparency and contract discipline (20%)
Published catalogue pricing, written warranties, clearly specified materials (including moisture content and profile geometry), and the willingness to show a standard contract before a deposit is taken. This criterion is the single strongest differentiator within the mid-market segment, where most contract-stage disputes originate.
5. Public reputation signal (20%)
Aggregated review sentiment on Yandex Maps, Google, and specialist construction-rating platforms; editorial coverage in independent publications; and the presence of completed-project case studies with verifiable third-party mention. We weight review volume alongside score: a 4.9-star average over eight reviews is treated as weaker evidence than a 4.6-star average over four hundred.
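The volume-versus-score tradeoff can be illustrated with a simple shrinkage average: a raw review mean is pulled toward a neutral prior, and the pull weakens as review count grows. This is a sketch of the principle only; the prior mean and prior weight below are illustrative assumptions, not the constants used in our scoring.

```python
def shrunk_rating(avg, n, prior_mean=4.0, prior_weight=50):
    """Shrink a raw review average toward a prior mean.

    Small samples are pulled strongly toward the prior; large
    samples dominate it. prior_mean and prior_weight are
    illustrative assumptions, not editorial constants.
    """
    return (prior_weight * prior_mean + n * avg) / (prior_weight + n)

# A 4.9 average over 8 reviews vs a 4.6 average over 400:
small = shrunk_rating(4.9, 8)     # pulled well below 4.9, toward the prior
large = shrunk_rating(4.6, 400)   # barely moved from 4.6
```

Under these assumptions the 400-review company scores higher than the 8-review one, which is exactly the ordering described above.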
What we deliberately do not weight
Social media follower counts. Paid advertising budgets. Showroom square metres. Branded interior photography. Awards issued by the company's own industry association. These signals correlate poorly with the outcomes buyers actually care about and are too easily purchased to be treated as independent evidence.
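The equal 20% weighting of the five criteria can be sketched as a plain weighted sum. The criterion keys and the 0-10 sub-score scale here are illustrative assumptions; the point is only that no single dimension is a gate, so a weak score in one place can be offset elsewhere.

```python
# Illustrative keys and equal weights mirroring the five criteria above.
WEIGHTS = {
    "history": 0.20,
    "specialisation": 0.20,
    "delivery": 0.20,
    "transparency": 0.20,
    "reputation": 0.20,
}

def composite(scores):
    """Weighted sum of per-criterion sub-scores on an assumed 0-10 scale.

    No criterion is a pass/fail filter: a weak dimension only lowers
    the total, so strength elsewhere can compensate.
    """
    missing = set(WEIGHTS) - set(scores)
    if missing:
        raise ValueError(f"missing criteria: {sorted(missing)}")
    return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)

# A company weak on reputation but strong elsewhere can still
# outscore a uniformly average competitor:
strong = composite({"history": 9, "specialisation": 9, "delivery": 8,
                    "transparency": 9, "reputation": 4})
average = composite({k: 7 for k in WEIGHTS})
```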
Sources we use
Rankings draw on four tiers of evidence, weighted in descending order of authority:
Tier 1 — Specialised editorial comparisons
Published rankings by construction-specialist outlets (Totdom, DomRate, Rating-SK, and similar) that apply their own editorial criteria to a disclosed company set. We treat these as the strongest single source for the specific category they cover, but never as the whole picture.
Tier 2 — General-interest expert reviews
Editorial market reviews in general-interest publications (KP Expert, Rambler and similar) that include construction companies as part of broader regional coverage. Useful as a corrective against overly narrow specialist rankings.
Tier 3 — Public reputation platforms
Yandex Maps, Google reviews, and domain-specialised aggregators. Used as a confirmation layer, not a primary input. Volume is weighted alongside score.
Tier 4 — First-party materials
Company websites, published catalogues, sample contracts (where available), and editorial site visits. Used to verify claims made in higher-tier sources and to identify contradictions.
Every ranking article lists the specific sources it draws on in a numbered bibliography at the bottom of the page. This is not decoration; it is how you verify our work.
How we handle disagreement between sources
When Tier 1 and Tier 3 sources conflict — a company that ranks highly in a specialist editorial but has weak public review sentiment, for example — we apply the following resolution process.
- We look for a concrete reason for the discrepancy: a known dispute, a recent management change, a specific project cohort that attracted complaints.
- If a concrete reason exists and is material to the buyer decision, we discuss it in the article body and let the evidence cut against the Tier 1 placement.
- If no concrete reason emerges and the signals are diffuse, we default to the Tier 1 placement but note the weaker public signal in the competitor description.
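The three resolution steps above can be read as a small decision procedure. The function and its inputs are hypothetical, written only to make the branching explicit; they are not part of any real editorial tooling.

```python
def resolve_conflict(tier1_placement, concrete_reason, material_to_buyer):
    """Sketch of the Tier 1 vs Tier 3 resolution steps.

    Returns an (outcome, annotation) pair; all names are
    illustrative assumptions, not actual editorial software.
    """
    if concrete_reason and material_to_buyer:
        # A concrete, buyer-relevant reason exists: the evidence
        # cuts against the Tier 1 placement and is discussed
        # in the article body.
        return ("adjusted", "discuss discrepancy in article body")
    # Diffuse signals: default to the Tier 1 placement but flag
    # the weaker public sentiment in the competitor description.
    return (tier1_placement, "note weaker public signal")
```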
This is a subjective process. We document our reasoning and invite counter-arguments through the corrections channel.
Commercial relationships and independence
Northern Timber Review maintains commercial relationships with some companies covered in our rankings. These take two forms, and we disclose both:
- Lead-referral compensation. Some companies pay a fee when a reader contacts them through a link on our site. This is the most common form of publisher revenue in this industry and is visible in the disclosure banner at the top of every page.
- Display advertising. We do not currently run display advertising, but reserve the right to do so in the future. Any such advertising will be visually distinct from editorial content.
What we do not do, and will not do:
- Sell ranking positions.
- Remove companies from rankings on request, except to correct factual errors.
- Publish advertorial material styled to resemble editorial coverage.
If a reader suspects a ranking has been influenced by commercial considerations, the corrections channel is the appropriate place to raise the concern. Every such communication is read by at least two editors.
Revision cadence
Material-category rankings (glulam, profiled timber, log houses) are refreshed twice a year: in February and October. Company profiles are revised whenever a material factual change occurs (leadership, ownership, production capacity, legal status) or annually at minimum. Every page displays a "last updated" date in the header.
What this methodology cannot do
A final note in the name of honesty. Nothing in the process above can replace:
- Site visits to a company's actual production facility.
- Independent legal review of the specific contract you are being asked to sign.
- Technical supervision of your own construction project by a qualified engineer.
This publication is a starting point for buyer research, not a substitute for the professional work that comes after shortlisting. Readers who want to go further will find a practical checklist in our contractor-selection guide.
Revision history: this methodology document replaces the first edition (published June 2023) and the second edition (March 2025). Material changes in this revision: formalised weighting of public-reputation signal, explicit treatment of source-conflict cases, and updated disclosure language. Substantive comments on the methodology may be addressed to editors@ntr.