Integrating SEO requirements into the design thinking process of a digital studio
Bibliographic description of the article for citation:
Simonov Mykhailo. Integrating SEO requirements into the design thinking process of a digital studio // Science online: International Scientific e-zine. – 2025. – №11. – https://nauka-online.com/en/publications/information-technology/2025/11/04-35/
Information technology
Simonov Mykhailo
SEO, Vision Lab Studio
(Ukraine, Kharkiv)
https://www.doi.org/10.25313/2524-2695-2025-11-04-35
INTEGRATING SEO REQUIREMENTS INTO THE DESIGN THINKING PROCESS OF A DIGITAL STUDIO
Summary. The article explores the integration of SEO requirements into the design thinking process of a digital studio, with the aim of synchronizing creative solutions with measurable indicators of page quality and search accessibility. At the theoretical level, it analyzes the applicability of the Double Diamond framework and the d.school’s five-step process to the tasks of technical and content optimization, as well as to compliance with public search engine guidelines. The empirical part draws on industry guidelines that allow design solutions to be compared against metrics such as LCP/INP/CLS, CTR, and conversions. The article shows that early integration of SEO at the Empathize and Define stages supports the identification of search intent and the shaping of the information architecture; at the Ideate and Prototype stages, the team prepares “SEO-ready” templates and runs laboratory tests with Lighthouse; and at the Test stage, eligibility for rich result formats and indexability are verified. The practical significance of the approach lies in reduced technical debt, more predictable organic demand, and better manageability of return on marketing investment (ROMI) and customer acquisition cost (CAC) through transparent quality gates in continuous integration/continuous delivery (CI/CD).
Key words: design thinking, Double Diamond, d.school, SEO, Core Web Vitals, LCP, INP, CLS, indexability, structured data, schema.org, rich results, Chrome UX Report (CrUX), Lighthouse, Search Console, information architecture, internal linking, CTR, ROMI, CAC/LTV.
Relevance of the study. The relevance of this research stems from the intersection of three significant trends in digital development: the increasing demands of search engines for content quality (E-E-A-T and Core Web Vitals), rising user expectations for visually appealing and accessible interfaces, and tightening business requirements for the return on traffic investment. In traditional SEO processes, these factors are often addressed separately, which delays optimization and forces costly redesigns of the information architecture, templates, and front-end interfaces, resulting in indexing and performance issues, keyword cannibalization, and wasted crawl budget.
Integrating SEO requirements into each stage of the design thinking process (Empathize-Define-Ideate-Prototype-Test) makes it possible to formalize search intent in personas and scenarios, to lock in requirements for content structure, internal linking, and structured markup at the task-setting stage, and to check LCP/INP/CLS and accessibility indicators already during prototyping. This reduces technical debt, accelerates releases, and makes organic demand more predictable.
For digital studios operating in highly competitive markets, with fluctuating search engine algorithms and rising acquisition costs, this integration becomes a significant factor in economic efficiency (ROMI, CAC/LTV), risk management, and service quality. Therefore, the development of a model and techniques for incorporating SEO into design thinking is a crucial scientific and practical challenge that bridges the gap between creative stages and search ecosystem requirements.
The purpose of the study. The purpose of this study is to validate and verify a model for integrating SEO requirements into the design process of a digital studio, a model that aligns user intent, information architecture, technical constraints, and search engine criteria. The study then assesses the effects of this integration on Core Web Vitals, search visibility, click-through rate (CTR), and business metrics of traffic efficiency.
The results of the study. The theoretical foundations of this section are based on two well-established approaches to project work in design and development: the “Double Diamond” model of the British Design Council and the five-stage design-thinking model from Stanford University’s d.school. The Double Diamond describes the divergent-convergent process of research and problem solving and includes four stages – Discover, Define, Develop, and Deliver – which are widely used as a visual framework for innovation. The official materials from the Design Council describe the purpose of each stage and emphasize the universal applicability of the model in various industries, as well as its flexibility for adaptation in design practice [10].
The five-stage d.school model – Empathize, Define, Ideate, Prototype, and Test – is described in the Stanford Process Guide as human-centered, iterative, and non-linear: stages may run in parallel and cycles may repeat, as documented in Stanford’s openly available materials and specialized UX educational resources [1].
Fig. 1 illustrates the logic of the transitions and examples of activities at each stage. Below, these stages are compared to the measurable requirements of the search ecosystem.
Fig. 1. Stanford d.school design thinking process: Empathize, Define, Ideate, Prototype, Test, and characteristic practices [9]
The practical implementation of these theoretical foundations in digital studios today inevitably touches structured data and indexing. Search engines use markup to better understand content, and Schema.org, a community-maintained family of vocabularies, is designed for embedding structured data in web pages [8].
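As an illustration, a page template can emit such markup as JSON-LD, the format recommended in Google’s structured data documentation. A minimal TypeScript sketch, assuming a template system that assembles the page head; the property set is an illustrative subset of schema.org/Article, and all values are placeholders:

```ts
// Illustrative subset of schema.org/Article markup; values are placeholders.
const articleJsonLd = {
  "@context": "https://schema.org",
  "@type": "Article",
  headline: "Integrating SEO requirements into the design thinking process",
  author: { "@type": "Person", name: "Mykhailo Simonov" },
  datePublished: "2025-11-04",
};

// Serialized into the page head so crawlers and the Rich Results Test can parse it.
const jsonLdTag =
  `<script type="application/ld+json">${JSON.stringify(articleJsonLd)}</script>`;
```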
From a conceptual integration model perspective, it is important that design frameworks such as Double Diamond and five-stage design thinking are compatible with each other, as they follow the logic of transition from research to solution verification. SEO requirements in Google’s open guidelines fall on these transitions as verifiable quality criteria.
Table 1 shows the generally accepted thresholds for Core Web Vitals as defined in Google’s official documentation. These thresholds draw the boundaries between “good,” “needs improvement,” and “poor,” and are used across the industry as quality targets.
Table 1
Common Core Web Vitals Thresholds
| Metric | Good | Needs improvement | Poor |
| --- | --- | --- | --- |
| LCP (Largest Contentful Paint) | ≤ 2.5 s | 2.5–4.0 s | > 4.0 s |
| INP (Interaction to Next Paint) | ≤ 200 ms | 200–500 ms | > 500 ms |
| CLS (Cumulative Layout Shift) | ≤ 0.1 | 0.1–0.25 | > 0.25 |
Source: [3]
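To make these thresholds operational in a studio workflow, they can be encoded as a small rating function used in design reviews or automated checks. A minimal TypeScript sketch, assuming millisecond inputs for LCP and INP and a unitless score for CLS:

```ts
type Rating = "good" | "needs improvement" | "poor";

// Boundaries from Table 1: [upper bound of "good", upper bound of "needs improvement"].
const thresholds = {
  LCP: [2500, 4000], // ms
  INP: [200, 500],   // ms
  CLS: [0.1, 0.25],  // unitless
} as const;

function rate(metric: keyof typeof thresholds, value: number): Rating {
  const [good, poor] = thresholds[metric];
  return value <= good ? "good" : value <= poor ? "needs improvement" : "poor";
}

console.log(rate("LCP", 2300)); // "good"
console.log(rate("INP", 320));  // "needs improvement"
```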
Research questions and hypotheses are based on public empirical guidelines regarding clickability, “zero clicks”, and the impact of speed on user behavior. Industry survey data confirms the “steep CTR curve”, where the top search engine results page (SERP) positions multiply the likelihood of a click. However, a significant proportion of searches are completed without visiting websites, setting the stage for evaluating the effects of studio interventions.
From here, we formulate concise questions: how does integrating SEO requirements into the design process increase the likelihood of reaching top positions and rich result formats, mitigate “zero-click” losses, and reduce bounce by improving speed and responsiveness? The hypotheses being tested rest on patterns established in open sources: position growth and access to richer formats are associated with higher CTR, and improved Core Web Vitals metrics are associated with lower bounce rates and better traffic retention [2].
Fig. 2 illustrates the significant advantage of the top positions and the declining dynamics of clicks towards the bottom of search results, highlighting the impact of incorporating SEO requirements into the design process (enhancing visibility and accessibility to advanced formats).
Fig. 2. Average organic CTR by position in search results [11]
As part of practical integration, it is useful to fix public numerical benchmarks that development and SEO teams can use as baseline priorities. The weighting coefficients of the Lighthouse metrics determine each indicator’s contribution to the lab Performance Score, which helps focus optimization at the prototype level and identify which changes will most noticeably move the final score, especially during decision-making at the Ideate/Prototype boundary (Fig. 3).
Fig. 3. Distribution of contributions of Lighthouse (v10) metrics to the overall Performance Score [5]
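To make the weighting concrete, the following TypeScript sketch reproduces the weighted-average logic behind the Performance Score; the weights follow the Lighthouse v10 documentation cited for Fig. 3, while the individual metric scores (each normalized to the 0–1 range) are hypothetical inputs:

```ts
// Lighthouse v10 metric weights (see Fig. 3); verify against the current
// scoring documentation, as weights change between major versions.
const weightsV10: Record<string, number> = {
  FCP: 0.10, // First Contentful Paint
  SI: 0.10,  // Speed Index
  LCP: 0.25, // Largest Contentful Paint
  TBT: 0.30, // Total Blocking Time
  CLS: 0.25, // Cumulative Layout Shift
};

// Weighted average of per-metric scores, each already normalized to 0..1.
function performanceScore(scores: Record<string, number>): number {
  return Object.entries(weightsV10).reduce(
    (sum, [metric, weight]) => sum + weight * (scores[metric] ?? 0),
    0,
  );
}

// TBT's 30% weight means main-thread blocking dominates prioritization.
console.log(performanceScore({ FCP: 0.9, SI: 0.85, LCP: 0.8, TBT: 0.6, CLS: 1.0 }));
```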
An analysis of open industry sources shows that decomposing LCP into sub-components – TTFB (time to first byte), the delay before the LCP resource starts loading, and the resource load time itself – provides measurable guidelines for design decisions. According to the Web Almanac 2024 (HTTP Archive), median values for the “good,” “needs improvement,” and “poor” groups show that the main contributor to LCP is server delay rather than the actual duration of the image download: as LCP status worsens, both TTFB and the resource load delay grow sharply. This confirms the importance of optimizing request/response time and ensuring the “static discoverability” of LCP elements at the level of markup and critical resources [7].
Table 2
Decomposition of p75 LCP into components
| p75 LCP group (July 2024) | TTFB, ms | Resource load delay, ms | Resource load time, ms | Render delay, ms |
| --- | --- | --- | --- | --- |
| Good | 600 | 350 | 160 | 230 |
| Needs improvement | 1360 | 720 | 270 | 310 |
| Poor | 2270 | 1290 | 350 | 360 |
Source: [7]
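The same decomposition can be reproduced in a project’s own RUM code using standard browser APIs, which lets a team compare its field data against Table 2. A hedged TypeScript sketch, assuming the LCP element is an image whose resource URL can be matched in the resource timing buffer (the function and field names are ours):

```ts
// Decomposes LCP into the Table 2 sub-parts: TTFB, resource load delay,
// resource load time, and render delay, following the Web Almanac breakdown.
interface LcpEntry extends PerformanceEntry {
  url: string; // resource URL of the LCP element (empty for text nodes)
}

function observeLcpBreakdown(report: (parts: Record<string, number>) => void): void {
  new PerformanceObserver((list) => {
    const entries = list.getEntries() as LcpEntry[];
    const lcp = entries[entries.length - 1];
    if (!lcp) return;

    const nav = performance.getEntriesByType("navigation")[0] as PerformanceNavigationTiming;
    const ttfb = nav.responseStart;

    // Resource timing entry for the LCP image, if any.
    const res = performance
      .getEntriesByType("resource")
      .find((r) => r.name === lcp.url) as PerformanceResourceTiming | undefined;

    const loadDelay = res ? Math.max(0, res.startTime - ttfb) : 0;
    const loadTime = res ? res.responseEnd - res.startTime : 0;
    const renderDelay = lcp.startTime - (res ? res.responseEnd : ttfb);

    report({ ttfb, loadDelay, loadTime, renderDelay });
  }).observe({ type: "largest-contentful-paint", buffered: true });
}
```

The open-source web-vitals library also ships an attribution build that provides a comparable breakdown without hand-written observers.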
The guidelines obtained make it possible to understand how design decisions taken at different stages of the design process contribute to the measured quality indicators. Dividing LCP (Largest Contentful Paint) into sub-components shows that reducing TTFB (Time to First Byte) and shortening the LCP resource load remain the priorities: these components most strongly move a page between the “good,” “needs improvement,” and “poor” categories.
Therefore, when transferring knowledge from the literature into the studio process, some of the most effective practices are declaring the LCP element early in the HTML, minimizing blocking resources, and applying network/cache optimization (preload, prioritization, critical CSS). These practices are consistent with the published histograms of LCP image sizes and with the Web Almanac’s emphasis on “static discoverability.”
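For instance, an “SEO-ready” hero template can declare the LCP image statically and hint its priority. A sketch of the emitted HTML fragments (shown here as TypeScript template strings; paths, dimensions, and attribute choices are illustrative):

```ts
// Static discoverability: the LCP image is present in the initial HTML with
// a preload hint and high fetch priority, so the browser's preload scanner
// finds it before any script runs. All values are placeholders.
const headFragment = `
  <link rel="preload" as="image" href="/img/hero.avif" fetchpriority="high">`;

const bodyFragment = `
  <img src="/img/hero.avif" width="1200" height="630" alt="Hero image"
       fetchpriority="high" decoding="async">`;
```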
The positive trajectory of CLS in CrUX allows sustained improvements to be interpreted as a result of the broad adoption of layout-stability techniques. However, differences between devices and over time suggest that the portability of these solutions depends on specific layout patterns, the behavior of advertising and analytics scripts, and discipline in handling images and fonts. These findings are consistent with industry reports on the impact of third-party scripts and main-thread blocking, which directly affect INP/LCP metrics and overall search visibility.
The relationship between speed and conversion, as documented in the Deloitte/Think with Google study, provides a managerial basis for linking project milestones to business indicators. Improving Core Web Vitals metrics on prototypes and releases should be considered not as “technical debt” but as an investment with measurable returns.
At the same time, the effect of these improvements depends on the specific vertical and competitive environment. Interpretation should take into account factors such as seasonality and changes in search engine results page (SERP) layouts [6].
Google and SearchPilot case studies indicate that an increase in click-through rate (CTR) and/or organic traffic can be achieved with high-quality rich results, which means that not only the validity of the markup matters but also compliance with content and display policies. For SEO integration, the focus therefore shifts to determining page types first and designing content blocks optimized for the formats that search engines support. Combined with the Web Almanac data, the results suggest a mixed approach: technical optimization of the critical rendering path for LCP and INP, and semantic optimization of content presentation to improve clickability [4].
The limitations of interpretation stem from the fact that the data presented here is either aggregated across the entire ecosystem (CrUX and HTTP Archive) or represents specific cases. To apply it to individual projects, it must be compared with the project’s own RUM (Real User Monitoring) data and Search Console reports, taking into account factors beyond the team’s control (such as algorithm changes and the introduction of new SERP elements).
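One practical way to perform that comparison is to pull a project’s field percentiles from the public Chrome UX Report API. A hedged TypeScript sketch; the API key handling and response-shape assumptions should be verified against the current API reference:

```ts
// Queries the CrUX API for origin-level p75 values of the Core Web Vitals,
// for comparison against the Table 1 thresholds. Error handling is omitted.
async function fetchCruxP75(origin: string, apiKey: string): Promise<Record<string, number>> {
  const resp = await fetch(
    `https://chromeuxreport.googleapis.com/v1/records:queryRecord?key=${apiKey}`,
    {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({
        origin,
        metrics: [
          "largest_contentful_paint",
          "interaction_to_next_paint",
          "cumulative_layout_shift",
        ],
      }),
    },
  );
  const data = await resp.json();
  const p75: Record<string, number> = {};
  for (const [name, metric] of Object.entries<any>(data.record.metrics)) {
    p75[name] = Number(metric.percentiles.p75);
  }
  return p75;
}
```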
Despite these limitations, the consistency of trends across sources – the growing share of “good” Core Web Vitals (notably CLS), the dominant role of Time to First Byte (TTFB) in Largest Contentful Paint (LCP), the positive relationship between speed and conversions, and the impact of rich results – provides grounds for practical conclusions about the viability of integrating SEO requirements early in the design process of digital studios.
Turning to practical recommendations, it is advisable to integrate SEO into project management: include mandatory milestones in roadmaps for agreeing on intent and section structure, assign roles for completing key templates and quality control, and introduce uniform team rules for describing pages and links. This ties interface design to the requirements of the search ecosystem in advance and removes the risk of late rework.
For operational work, it is useful to standardize assets: a single brief format with a search scenario, a catalog of reusable blocks with pre-prepared markup options, and a publication checklist covering response status checks, canonicalization, links to language versions, and sitemaps. We recommend automating routine checks during the build, keeping a log of changes that affect visibility, and providing a quick rollback plan for releases that touch critical pages; one possible automation is sketched below.
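A hedged sketch of such a build-time gate using Lighthouse CI (a lighthouserc.js in CommonJS form); the staging URL and budget values are illustrative assumptions tied to the Table 1 thresholds:

```js
// lighthouserc.js: fails the pipeline when lab metrics regress past the
// chosen budgets. Audit IDs follow Lighthouse naming; verify option names
// against the current @lhci/cli documentation.
module.exports = {
  ci: {
    collect: {
      url: ["https://staging.example.com/"], // hypothetical staging URL
      numberOfRuns: 3,                       // median of several runs reduces noise
    },
    assert: {
      assertions: {
        "categories:performance": ["error", { minScore: 0.9 }],
        "largest-contentful-paint": ["error", { maxNumericValue: 2500 }],
        "cumulative-layout-shift": ["error", { maxNumericValue: 0.1 }],
      },
    },
  },
};
```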
To sustain the effect over the long term, it is essential to establish a cycle of monitoring and continuous improvement: regular review of rich results performance reports, analysis of content visibility and click-through rates, comparison of demand against the content calendar, and planned experiments that test hypotheses about new SERP elements and key page blocks.
Implementing these procedures and tracking them through performance metrics keeps SEO work effective and predictable, minimizes traffic loss, and allows the effect of integrating SEO into the design thinking process to grow systematically, while rapid response to incidents maintains the quality of the work.
Conclusions. Integrating SEO as an end-to-end process across the Empathize-Define-Ideate-Prototype-Test phases transforms search engine requirements into a set of verifiable artifacts and checkpoints, from intent maps and internal-linking diagrams to validated structured data and target thresholds for LCP/INP/CLS. Comparison with industry data shows that reducing TTFB (Time to First Byte) and ensuring early availability of the LCP (Largest Contentful Paint) element contribute most to improving the user experience, while valid markup and actual rich result impressions correlate with higher CTR (Click-Through Rate). Embedding these practices in the development process (Lighthouse/Lighthouse CI, Search Console, the Rich Results Test) reduces rework, lowers the risk of wasted crawl budget and keyword cannibalization, makes organic traffic more predictable, and links design decisions to economic outcomes (Return on Marketing Investment, Customer Acquisition Cost/Lifetime Value).
Therefore, the systematic integration of SEO into the design process is essential for the long-term success and viability of digital products in terms of both technology and business.
References
1. An Introduction to Design Thinking PROCESS GUIDE. Access mode: https://web.stanford.edu/~mshanks/MichaelShanks/files/509554.pdf.
2. Google Organic CTR Tool. Access mode: https://www.advancedwebranking.com/free-seo-tools/google-organic-ctr.
3. How the Core Web Vitals metrics thresholds were defined. Access mode: https://web.dev/articles/defining-core-web-vitals-thresholds.
4. Introduction to structured data markup in Google Search. Access mode: https://developers.google.com/search/docs/appearance/structured-data/intro-structured-data.
5. Lighthouse performance scoring. Access mode: https://developer.chrome.com/docs/lighthouse/performance/performance-scoring.
6. Milliseconds make Millions. Access mode: https://www.thinkwithgoogle.com/_qs/documents/9757/Milliseconds_Make_Millions_report_hQYAbZJ.pdf.
7. Performance | 2024 | The Web Almanac by HTTP Archive. Access mode: https://almanac.httparchive.org/en/2024/performance.
8. Schema.org. Access mode: https://schema.org/.
9. Stanford d.school Design Thinking Process 5 Steps – Wiefling Consulting. Access mode: https://wiefling.com/design-thinking-powerful-methodology-projects-thinking-design/stanford-d-school-design-thinking-process-5-steps.
10. The Double Diamond – Design Council. Access mode: https://www.designcouncil.org.uk/our-resources/the-double-diamond/.
11. Top 9 Most Important SEO Metrics To Track. Access mode: https://www.similarweb.com/blog/marketing/seo/seo-metrics/.