- Department of Communication Sciences, Faculty of Social and Political Sciences, Hasanuddin University, Makassar, Indonesia
Integrating artificial intelligence into journalism has transformed how gender identities are represented and consumed across digital platforms. This perspective examines the emergence of “platform-specific masculinities” in Indonesian digital journalism, drawing from recent empirical evidence showing a 55-percentage-point decline in traditional masculine representations between 2019 and 2024. We propose the “Algorithmic Gender Representation Paradigm” (AGRP) as a theoretical framework emerging from Indonesian contexts for understanding how platform-specific affordances, machine learning algorithms, business models, and cultural contexts interact to influence journalistic content and audience engagement. Analysis of 240 h of television programming and 1,100 digital media items reveals that digital platforms demonstrate 111% higher engagement rates for emotionally expressive content than for traditional masculine representations, particularly among audiences aged 18–24. While television maintains predominantly traditional representations (65%), platforms like TikTok show significantly higher proportions of emotional (42%) and creative (45%) expressions. These patterns reflect not only algorithmic affordances but also divergent business models: advertiser-funded platforms optimizing for engagement versus broadcast television navigating regulatory constraints. Drawing on platform studies, feminist technology scholarship, and glocalization theory, we challenge assumptions about algorithmic neutrality and highlight the need for culturally sensitive AI development in journalism. We identify critical gaps in cross-cultural algorithmic bias studies and propose methodological approaches for examining long-term societal impacts.
The perspective concludes that understanding algorithmic influence on gender representation requires interdisciplinary collaboration, integrating communication studies, computer science, gender studies, and area studies to ensure that digital transformation serves democratic values and promotes culturally sensitive representation across diverse global contexts.
1 Introduction
The Reuters Institute Digital News Report 2024 reveals that 78% of 200 digital leaders, editors, and CEOs surveyed believe investment in artificial intelligence technology will be key to journalism’s survival (Newman, 2024). This unprecedented technological shift extends beyond operational changes to encompass profound alterations in how journalistic content constructs, reflects, and potentially transforms societal norms, particularly regarding gender representation. The migration of content from traditional broadcast media to algorithm-driven digital platforms has created substantively different representational environments with distinct engagement dynamics that demand new theoretical frameworks and methodological approaches.
Indonesia, as the world’s largest Muslim-majority nation with over 270 million inhabitants and a burgeoning digital economy valued at over $130 billion, provides a particularly valuable context for examining these dynamics. Recent analysis of Indonesian media demonstrates that digital transformation has catalyzed significant shifts in gender representation patterns, with traditional masculine representations declining from 85 to 30% between 2019 and 2024 (Sonni et al., 2025). This dramatic transformation coincides with 73% of global news organizations adopting AI technology (Sonni, 2025), suggesting that algorithmic systems may accelerate cultural change at unprecedented rates.
This rapid transformation raises fundamental questions about the relationship between technological systems and cultural change. Do platform algorithms merely amplify existing cultural trends, or do they actively shape the direction and pace of social transformation? How do global platform architectures interact with local cultural contexts to produce geographically specific patterns of representation? What role do platform business models play in determining which content gains visibility and engagement?
However, the implications of technological adoption for gender representation remain insufficiently theorized, particularly in non-Western contexts where traditional cultural values intersect with rapid digitalization. Existing research on AI in journalism focuses predominantly on Western media contexts (Amponsah and Atianashie, 2024; Pavlik, 2023), with insufficient attention to how algorithmic systems function in diverse cultural environments. This Western-centric focus creates significant gaps in understanding how platform algorithms interact with local gender norms, religious identities, and cultural expectations to produce geographically specific patterns of representation. Moreover, existing scholarship has inadequately addressed how platform business models (advertiser-funded versus state-supported, engagement-optimizing versus regulatory-constrained) shape the gender representations that gain algorithmic visibility (Srnicek, 2017; van Dijck et al., 2018).
This perspective article addresses these gaps by examining how algorithmic systems, platform business models, and culturally specific platform affordances shape gender representation in digital journalism, with particular attention to Indonesian contexts. We propose the Algorithmic Gender Representation Paradigm (AGRP) as a novel theoretical framework that emerges from Indonesian empirical observations while drawing on insights from platform studies, feminist media theory, and artificial intelligence ethics to propose mechanisms potentially applicable across diverse cultural contexts, pending systematic validation. The analysis synthesizes recent empirical findings to advance three central arguments: first, that platform algorithms function as active agents in shaping gender discourse rather than neutral distribution mechanisms; second, that the interaction between cultural context, platform political economy, and algorithmic systems produces geographically specific patterns requiring culturally informed analytical approaches; and third, that understanding these dynamics necessitates new methodological frameworks transcending traditional media studies boundaries.
The Indonesian case offers theoretical value for several reasons. First, as a Muslim-majority nation experiencing rapid digitalization, Indonesia represents contexts where global platform technologies intersect with non-Western cultural values, enabling examination of glocalization dynamics (Robertson, 1995; Kraidy, 2005). Second, Indonesia’s substantial digital economy scale ($130 billion) and large population (270 million) ensure that observed patterns reflect significant social phenomena rather than marginal cases. Third, Indonesia’s media landscape encompasses both state-funded broadcasters operating under public service mandates and commercial platforms navigating between ratings pressures and regulatory oversight, enabling comparative analysis of how different institutional structures shape gender representation. While these characteristics make Indonesia valuable for theory development, we acknowledge that framework validation requires systematic comparative research across diverse cultural contexts.
The article proceeds in four sections. Following this introduction, we present the AGRP framework and examine platform-specific patterns of gender representation, incorporating analysis of platform business models and regulatory contexts. We then discuss algorithmic bias and cultural sensitivity challenges in journalistic AI development. Finally, we identify critical research gaps and propose future directions for investigating the societal implications of AI-driven gender representation in journalism across diverse cultural contexts.
2 Theoretical foundations and the algorithmic gender representation paradigm
2.1 Platform studies and technological affordances
Understanding how digital platforms shape gender representation requires engaging with platform studies scholarship examining how technological architecture structures social practices and cultural production. Gillespie’s (2010) foundational work establishes platforms not as neutral intermediaries but as “computational, economic, and political architectures” that actively shape the information they distribute. Platform affordances, the action possibilities enabled by technological features, interact with user practices and cultural contexts to produce distinctive communicative environments (Bucher, 2018).
Different platforms embody distinct affordances shaping gender representation possibilities. TikTok’s emphasis on short-form video, algorithmic prioritization of content over follower counts, and “For You Page” recommendation system create environments where diverse content can achieve viral reach regardless of creator status (Abidin, 2021). Instagram’s visual-centric design and influencer culture foster forms of esthetic self-presentation (Duffy and Hund, 2015). YouTube’s longer-form video format and subscription-based distribution enable different narrative structures and audience relationships (Burgess and Green, 2018). Traditional television’s scheduled programming, regulatory oversight, and broadcast model create fundamentally different representational constraints (Lotz, 2017).
These platform-specific affordances do not determine content in simplistic ways but rather create structured possibilities within which cultural production occurs. As Plantin et al. (2016) argue, platforms function as infrastructures: simultaneously technical, organizational, and socio-political systems. Understanding gender representation in digital journalism, therefore, requires analyzing how platform architectures interact with journalistic practices, audience behaviors, and cultural norms to produce observable patterns.
2.2 Feminist technology studies and algorithmic gender
Feminist scholarship on technology provides essential foundations for understanding gender dynamics in algorithmic systems. Wajcman (2009) argues that technologies are not gender-neutral tools but rather embody and reproduce gender relations through design choices, implementation practices, and social contexts of use. This perspective challenges technological determinism while recognizing that technologies can have meaningful social effects depending on how they are configured and deployed.
Recent feminist work on algorithms specifically has documented how machine learning systems can perpetuate and amplify gender biases present in training data, developer assumptions, and optimization metrics (Noble, 2018; Eubanks, 2017). Noble’s Algorithms of Oppression demonstrates how search algorithms systematically marginalize women of color through biased training data and discriminatory associations. Benjamin’s (2020) Race After Technology extends this analysis to show how automated systems can encode existing social hierarchies while appearing objective and neutral.
However, feminist technology studies also recognize possibilities for algorithmic systems to challenge existing gender norms under certain conditions. When algorithms optimize for engagement rather than advertiser sensibilities or regulatory compliance, they may inadvertently favor content that challenges traditional gender representations if such content generates strong audience responses. This suggests that algorithmic gender effects depend significantly on business models, regulatory contexts, and the specific metrics algorithms optimize for, questions requiring empirical investigation across diverse contexts.
2.3 Glocalization and cultural adaptation of digital technologies
The concept of glocalization, the dynamic interplay between globalizing and localizing forces, provides crucial frameworks for understanding how global platform technologies operate in diverse cultural contexts (Robertson, 1995; Kraidy, 2005). Kraidy’s (2005) hybridity framework emphasizes that cultural globalization involves neither simple homogenization nor pure resistance but rather complex processes of selective appropriation, creative adaptation, and negotiated meaning.
Applied to digital platforms, glocalization theory suggests that global technological architectures interact with local cultural practices to produce geographically specific patterns. Iwabuchi’s (2002) work on Asian media demonstrates how cultural products circulate transnationally while being reinterpreted through local frameworks. Similarly, platform algorithms developed in Silicon Valley operate differently when deployed in Indonesian contexts where Islamic values, family structures, and gender norms differ substantially from Western reference points.
This theoretical perspective challenges universalist assumptions embedded in much technology scholarship while avoiding cultural relativism that would deny any cross-cultural patterns. It suggests that platform effects on gender representation will show both universal tendencies (driven by shared algorithmic logics) and culturally specific variations (driven by local appropriation and resistance), requiring comparative research to distinguish these dynamics.
2.4 Platform political economy and business models
Understanding algorithmic gender representation requires analyzing the economic structures and business models shaping platform operations. Srnicek’s (2017) Platform Capitalism framework identifies how digital platforms extract value through data collection, network effects, and algorithmic optimization, creating business models fundamentally different from traditional media. van Dijck et al.’s (2018) Platform Society extends this analysis to examine how platform logics reshape social institutions, including journalism.
Global social media platforms operate primarily as advertising-funded businesses where algorithms optimize for metrics maximizing commercial value: user engagement, watch time, click-through rates, and advertiser targeting precision (Zuboff, 2019). These commercial imperatives shape what content gains visibility. Algorithms prioritize content generating strong emotional responses, extended viewing sessions, or intensive user interactions regardless of whether such content aligns with traditional cultural values or challenges existing norms.
This political economic structure contrasts sharply with traditional broadcast media models. Indonesian state broadcaster TVRI operates under public service mandates with accountability to government ministries rather than advertisers. Commercial television stations like RCTI navigate between ratings pressures and regulatory oversight from the Indonesian Broadcasting Commission (KPI), which enforces content standards reflecting dominant cultural values. These institutional differences create divergent incentives regarding gender representation: platforms optimizing for engagement may favor emotionally expressive content challenging traditional masculinity if such content generates user interaction, while broadcast television, facing regulatory scrutiny, may maintain conservative representations to avoid sanctions.
The political economy of platforms also shapes content moderation and algorithmic amplification decisions in ways that affect gender representation. Platform content policies often developed in Western corporate contexts may not align with local cultural norms, creating tensions around what content is promoted, demoted, or removed (Gillespie, 2018). Understanding algorithmic gender representation, therefore, requires analyzing not only technical systems but also the business models, regulatory environments, and institutional structures within which those systems operate.
2.5 The algorithmic gender representation paradigm: a synthetic framework
The transformation of journalism through digital platforms necessitates a fundamental reconceptualization of how gender representation operates within media systems. Traditional media effects and representation models, developed primarily in broadcast contexts, prove inadequate for understanding complex interactions between algorithmic curation, audience engagement metrics, and content evolution characterizing contemporary digital journalism. The Algorithmic Gender Representation Paradigm (AGRP) addresses this theoretical gap by synthesizing insights from platform studies, feminist technology studies, and glocalization theory with empirical observations from Indonesian digital journalism contexts.
Building on the theoretical foundations established above, the AGRP rests on four foundational principles. First, algorithmic systems do not merely reflect existing gender norms but actively participate in their construction through visibility, engagement optimization, and content recommendation mechanisms. Empirical evidence demonstrates that emotional and creative content receives 380,000 and 320,000 interactions, respectively, on platforms like TikTok, compared to only 180,000 interactions for traditional masculine content (Sonni et al., 2025). These differential engagement rates create feedback loops influencing content production strategies, effectively incentivizing certain forms of gender expression while disincentivizing others. AI systems can analyze millions of data points in seconds and predict emerging news trends (de-Lima-Santos and Ceron, 2021), creating possibilities for rapid shifts in representational norms outpacing traditional cultural change mechanisms. This algorithmic agency operates through multiple mechanisms: personalized content recommendations exposing users to representations they might not encounter through traditional gatekeeping; engagement metrics providing immediate feedback to content creators about audience reception; and automated content moderation systems determining what representations circulate publicly (Gillespie, 2014; Bucher, 2018).
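The engagement differential cited in the abstract follows directly from these interaction counts. As a quick arithmetic check (the counts are the published figures; the helper function is ours, added for illustration):

```python
# Interaction counts for emotional, creative, and traditional masculine
# content on platforms like TikTok, as reported by Sonni et al. (2025).
EMOTIONAL, CREATIVE, TRADITIONAL = 380_000, 320_000, 180_000

def pct_higher(a: int, b: int) -> float:
    """Percentage by which `a` exceeds `b`."""
    return (a - b) / b * 100

print(round(pct_higher(EMOTIONAL, TRADITIONAL)))  # 111, the figure cited in the abstract
print(round(pct_higher(CREATIVE, TRADITIONAL)))   # 78
```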
Second, the AGRP recognizes that platform affordances and business models create distinct representational environments with geographically and culturally specific characteristics. While television maintains predominantly traditional masculine representations at 65%, digital platforms, especially TikTok, demonstrate significantly higher proportions of emotional (42%) and creative (45%) expressions. This divergence suggests that platform-specific technical features, interface designs, and algorithmic recommendation systems interact with local cultural contexts to produce unique patterns of gender representation. Critically, these patterns also reflect divergent economic logics: advertiser-funded platforms optimizing algorithms for engagement metrics that may favor emotionally resonant content regardless of cultural conservatism, versus broadcast television navigating regulatory constraints from the Indonesian Broadcasting Commission (KPI) that enforce alignment with dominant cultural values (Lim, 2012). The concept extends platform studies by examining specific mechanisms through which technological and economic characteristics influence gender representation, while extending feminist media theory by incorporating algorithmic systems and platform business models as relevant analytical objects.
Third, the paradigm emphasizes the temporal dynamism of algorithmic gender representation. The documented shift from 85% traditional masculine representation in 2019 to 30% in 2024 indicates that algorithmic systems can accelerate cultural transformation at rates rarely observed in media history. This temporal compression has profound implications for understanding the media’s role in social change, suggesting that the transition to algorithmically curated environments may fundamentally alter the dynamics of cultural transformation. However, questions remain about whether this acceleration represents permanent transformation or transitional instability that may stabilize or reverse under changing conditions (Plantin et al., 2016).
Fourth, the AGRP emphasizes glocalization dynamics: how global platform architectures interact with local cultural contexts to produce geographically specific patterns. While platform algorithms embody logics developed primarily in Western (particularly Silicon Valley) contexts, their operation in Indonesian environments occurs within distinctive cultural frameworks, including Islamic values, family structures, and gender norms that differ substantially from Western reference points (Kraidy, 2005). This interaction produces hybrid representational patterns that cannot be understood through either technological determinism (algorithms simply imposing Western values) or cultural autonomy (local cultures freely appropriating technologies). Rather, gender representation emerges from ongoing negotiation between algorithmic logics and cultural contexts, requiring analysis attending to both universal platform dynamics and culturally specific adaptations.
AGRP’s practical application demonstrates how AI technologies can enhance rather than diminish religious and cultural identity in professional branding when appropriately calibrated for cultural sensitivity. Analysis of Indonesian hijabi entrepreneurs reveals that posts containing Islamic references receive 23% higher engagement rates and 18% more positive sentiment scores than purely secular content (Putri and Sonni, 2023), directly contradicting assumptions embedded in many Western-developed AI systems. This finding highlights the necessity of culturally informed algorithm development and evaluation frameworks.
The framework provides conceptual foundations for understanding how digital transformation alters fundamental dynamics of gender representation in journalism while remaining potentially applicable across diverse cultural contexts pending systematic validation. We emphasize that AGRP emerges from Indonesian empirical observations and proposes mechanisms for understanding algorithmic gender representation in contexts where global platforms intersect with non-Western cultural values. While these mechanisms may operate in other contexts, claiming universal applicability requires comparative research across diverse cultural settings. The framework’s value lies not in immediate generalizability but in providing a theoretically grounded analytical approach that can be tested, refined, and potentially adapted for other contexts. Its emphasis on interaction between algorithmic systems, platform business models, cultural values, and audience behavior offers a dynamic model accounting for rapid representational shifts observed empirically while highlighting mechanisms through which platform characteristics shape content evolution and reception.
Table 1 reveals striking platform-specific variations that cannot be explained by content characteristics alone but rather reflect the interaction of platform affordances, business models, and cultural contexts theorized in AGRP. The 52-percentage-point difference in traditional masculine representation between television (65%) and TikTok (13%) reflects not only algorithmic recommendation differences but also divergent institutional logics: broadcast television operating under regulatory oversight, enforcing cultural conservatism, versus advertiser-funded platforms optimizing for engagement metrics that may favor emotionally expressive content generating strong user responses. Similarly, the substantially higher average engagement on digital platforms (333 k interactions) compared to television (3.2% rate) reflects different metrics and measurement systems but also different audience relationships and interaction possibilities enabled by platform architectures (Gillespie, 2010).
3 Methodological approach and data sources
The empirical observations discussed in this Perspective derive from two primary studies conducted by the author and colleagues between 2019 and 2024, examining gender representation evolution across Indonesian media platforms and AI-driven personal branding among hijabi entrepreneurs. Providing methodological transparency ensures readers can evaluate the evidentiary basis for theoretical claims while understanding the scope and limitations of empirical findings.
3.1 Gender representation study
The first study employed systematic content analysis examining masculine representation across Indonesian television and digital platforms (Sonni et al., 2025). The sample included 240 h of reality television programming from five major shows selected based on Nielsen ratings data and consistent cross-platform presence: Indonesian Idol (RCTI, 48 h), MasterChef Indonesia (RCTI, 48 h), The Voice Indonesia (GTV, 40 h), Super Deal Indonesia (Indosiar, 40 h), and Big Brother Indonesia (RCTI, 64 h). These programs span talent competitions, cooking shows, and reality competitions, representing diverse genres while maintaining a minimum 10% market-share viewership threshold.
Digital platform analysis examined 1,100 items distributed as follows: 100 full YouTube episodes (minimum 100,000 views each), 200 Instagram highlights (minimum 10,000 engagements), 300 TikTok clips (minimum 50,000 views), and 500 Twitter discussion threads (minimum 100 interactions each). This multi-platform sampling enabled tracking how content evolves and is received across different algorithmic environments and audience demographics.
Content was coded for three masculine representation categories based on behavioral and narrative indicators:
• Traditional Masculinity: competitive achievement emphasis, leadership displays, physical prowess, conventional family provider roles
• Emotional Masculinity: vulnerability expression, interpersonal connection emphasis, empathy displays, personal growth narratives
• Creative Masculinity: artistic expression, innovation emphasis, non-conventional presentation, alternative lifestyle choices
Two independent coders analyzed content using detailed rubrics with specific behavioral examples and contextual considerations. Intercoder reliability achieved Krippendorff’s alpha = 0.87, indicating strong agreement. Discrepancies were resolved through discussion and referral to a third senior coder when necessary.
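For readers wishing to reproduce the reliability statistic, Krippendorff’s alpha for nominal (categorical) codes can be computed with a short routine. The sketch below is illustrative, not the study’s actual analysis code:

```python
from collections import Counter
from itertools import permutations

def krippendorff_alpha_nominal(units):
    """Krippendorff's alpha for nominal (categorical) codes.

    `units` is a list of per-item code lists, one code per coder,
    e.g. [["traditional", "traditional"], ["emotional", "creative"], ...].
    """
    units = [u for u in units if len(u) >= 2]  # need at least 2 codes per item
    n = sum(len(u) for u in units)             # total pairable codes
    # Observed disagreement: mismatched ordered pairs within each unit,
    # weighted by 1 / (m_u - 1) per the coincidence-matrix formulation.
    d_o = sum(
        sum(1 for a, b in permutations(u, 2) if a != b) / (len(u) - 1)
        for u in units
    ) / n
    # Expected disagreement: mismatched ordered pairs across all codes.
    counts = Counter(code for u in units for code in u)
    d_e = sum(
        counts[a] * counts[b] for a in counts for b in counts if a != b
    ) / (n * (n - 1))
    return 1.0 if d_e == 0 else 1 - d_o / d_e

# Perfect agreement yields alpha = 1.0; by convention, values above
# roughly 0.80 (such as the study's 0.87) indicate strong agreement.
print(krippendorff_alpha_nominal([["T", "T"], ["E", "E"], ["C", "C"]]))  # 1.0
```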
Survey data from 1,000 respondents provided audience perspective, with demographic distribution reflecting Indonesia’s digital media user landscape based on APJII Internet User Survey data: 35% aged 18–24, 30% aged 25–34, 21% aged 35–44, and 14% aged 45+. Gender distribution included 52% male, 45% female, and 3% non-binary respondents. Geographic representation spanned 14 major Indonesian cities across Java, Sumatra, and Sulawesi.
Engagement metrics (views, likes, comments, shares, viewing time) were collected monthly from January 2019 through December 2024, with minimum inclusion thresholds ensuring sufficient engagement volume for analysis: 10,000 views, 1,000 likes, 100 comments, 50 shares, and 1,000 min of total viewing time per content item.
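Inclusion criteria of this kind reduce to a simple per-item filter. The sketch below is hypothetical; the metric field names are ours, not the study’s actual pipeline:

```python
# Monthly minimums per content item, mirroring the thresholds described
# above. Field names are illustrative, not the study's schema.
THRESHOLDS = {
    "views": 10_000,
    "likes": 1_000,
    "comments": 100,
    "shares": 50,
    "viewing_minutes": 1_000,
}

def meets_inclusion_criteria(item: dict) -> bool:
    """True when an item clears every engagement threshold."""
    return all(item.get(metric, 0) >= floor for metric, floor in THRESHOLDS.items())

sample = {"views": 52_000, "likes": 4_100, "comments": 230,
          "shares": 95, "viewing_minutes": 8_400}
print(meets_inclusion_criteria(sample))  # True
```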
3.2 Hijabi entrepreneur study
The second study examined AI-driven personal branding among Indonesian hijabi entrepreneurs through sentiment analysis of 2,847 social media posts and in-depth interviews with 35 entrepreneurs across technology, e-commerce, fashion, food, education, and healthcare sectors. Entrepreneurs were selected through startup accelerator databases, government entrepreneurship programs, and social media hashtag analysis, with criteria including active startup operation for a minimum of 12 months, visible hijab wearing in profile photos, and a minimum of 1,000 social media followers (Putri and Sonni, 2023).
Natural language processing analyzed Islamic terminology integration, content categories, and audience engagement patterns. Computer vision assessed hijab visibility and styling patterns in visual content. Sentiment scores were calculated using Indonesian-language trained models incorporating Islamic terminology to ensure cultural accuracy. Semi-structured interviews (60–90 min each) explored personal branding strategies, cultural identity management, AI tool usage, and entrepreneurial challenges specific to religious identity.
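The cultural-accuracy point can be illustrated with a toy lexicon-based scorer. Everything below, including the words and their weights, is an invented example; the study’s actual Indonesian-language models are more sophisticated and are not reproduced here:

```python
# Toy illustration of augmenting a base sentiment lexicon with Islamic
# terminology so that religious phrasing is not scored as neutral or
# negative. All terms and polarity weights are invented examples.
BASE_LEXICON = {"bagus": 1.0, "hebat": 1.0, "buruk": -1.0}           # generic Indonesian terms
ISLAMIC_LEXICON = {"alhamdulillah": 1.0, "insyaallah": 0.5, "barakah": 1.0}

LEXICON = {**BASE_LEXICON, **ISLAMIC_LEXICON}

def sentiment_score(text: str) -> float:
    """Mean polarity of lexicon hits; 0.0 when no term matches."""
    hits = [LEXICON[w] for w in text.lower().split() if w in LEXICON]
    return sum(hits) / len(hits) if hits else 0.0

# Without the Islamic lexicon, "alhamdulillah" would contribute nothing
# and religiously inflected praise would be under-scored.
print(sentiment_score("alhamdulillah produk ini bagus"))  # 1.0
```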
3.3 Analytical limitations
These methodological approaches enable analysis of both content patterns (what representations appear) and reception dynamics (how audiences engage), providing complementary perspectives on algorithmic gender representation. However, several limitations constrain interpretation:
• The geographic focus on Indonesia limits generalizability to other cultural contexts
• The 2019–2024 temporal window may not capture longer-term evolutionary patterns
• Reliance on publicly available social media data is subject to platform API restrictions and access limitations
• Content analysis captures observable representations but not production processes or creator intentions
• Survey and interview data reflect reported perceptions rather than actual behaviors
• Platform algorithm opacity prevents direct examination of the recommendation system mechanics
These limitations underscore the need for comparative research across diverse contexts and longer temporal horizons to validate and refine the theoretical frameworks proposed here.
4 Platform specificity and algorithmic bias in journalistic AI
Empirical evidence reveals profound platform-specific variations in gender representation that cannot be adequately explained through content alone. Analysis demonstrating that identical content receives dramatically different audience responses depending on distribution platform highlights the necessity of examining platform-specific technical and economic affordances as crucial determinants of representational outcomes. Television’s one-to-many communication models, rigid scheduling requirements, and advertiser-driven revenue structures create economic and organizational incentives favoring familiar, broadly acceptable content over experimental representations. Moreover, Indonesian television operates under regulatory oversight from the Indonesian Broadcasting Commission (KPI), which enforces content standards aligned with dominant cultural values, creating institutional pressures toward conservative gender representations (Lim, 2012).
In contrast, TikTok’s emphasis on short-form video, algorithmic prioritization of engagement over follower counts, and predominantly younger user base create environments where emotional and creative masculine expressions find substantial audiences. Critically, TikTok’s business model optimizes algorithms for user engagement metrics (watch time, completion rate, interaction frequency) rather than regulatory compliance or advertiser sensibilities, creating economic incentives favoring content that generates strong emotional responses regardless of whether such content aligns with traditional cultural norms (Abidin, 2021). YouTube occupies an intermediate position, with longer-form content enabling more complex narratives while subscription-based distribution creates different audience relationships than TikTok’s discovery-oriented algorithm (Burgess and Green, 2018). Instagram’s visual-centric design and influencer culture foster forms of esthetic self-presentation distinct from both TikTok’s virality-oriented and YouTube’s subscription-based models (Duffy and Hund, 2015).
These platform-specific patterns have profound implications for journalism. Journalists have evolved into “digital curators” managing various information sources and AI tools (Pavlik, 2023), requiring navigation of distinct representational norms and engagement patterns across multiple platforms simultaneously. This multi-platform environment creates opportunities and challenges for journalistic practice, as content must be adapted to platform-specific affordances while maintaining editorial integrity and consistency. The political economic pressures differ substantially across platforms: optimizing for TikTok engagement may require emotional expressiveness, challenging traditional gender norms, while television content must satisfy both ratings pressures and regulatory constraints favoring conventional representations. Journalists navigating these divergent pressures face strategic choices about how to maintain professional standards while adapting to platform-specific logics (van Dijck et al., 2018).
Integrating artificial intelligence into journalistic workflows raises fundamental questions about algorithmic bias, cultural sensitivity, and news epistemology. As AI evolves from a mere tool to an information provider and processor, traditional journalistic definitions of verification, objectivity, and credibility require revision (Husnain et al., 2024). Current AI systems deployed in newsrooms typically rely on training data drawn predominantly from Western, English-language sources, creating potential for systematic bias against non-Western cultural contexts and gender norms. This Western-centrism in AI development reflects broader patterns of technological colonialism where platforms developed in Silicon Valley contexts embed assumptions about appropriate content, user behavior, and cultural values that may not translate across diverse global contexts (Couldry and Mejias, 2019).
Research on Indonesian hijabi entrepreneurs reveals that mainstream AI tools often misinterpret or marginalize Islamic content (Putri and Sonni, 2023), suggesting similar biases may affect journalistic AI applications. When natural language processing systems trained primarily on secular Western texts encounter content integrating religious identity with professional messaging, algorithms may fail to accurately assess sentiment, relevance, or quality, potentially leading to systematic underrepresentation of religiously inflected perspectives in algorithmically curated news environments. Munir’s (2025) analysis of religious bias in AI datasets demonstrates that mainstream systems often associate Islamic terminology with negative sentiment or extremism, creating systematic disadvantages for content authentically integrating religious identity, precisely the content that empirical evidence shows receives higher engagement in Muslim-majority contexts.
Developing culturally sensitive AI systems for journalism requires not only technical refinement but also fundamental reconceptualization of appropriate training data and evaluation metrics. The finding that posts containing Islamic references receive 23% higher engagement rates contradicts assumptions embedded in many Western-developed AI systems that treat religious content as less professional or less broadly appealing. Journalistic AI systems built on such assumptions would systematically misunderstand audience preferences in Muslim-majority contexts, potentially leading to content recommendation and production strategies misaligned with actual reader interests. This misalignment has concrete consequences: algorithms systematically downranking religiously inflected content may inadvertently marginalize perspectives resonating strongly with local audiences, while amplifying secularized content that actually receives lower engagement (Noble, 2018).
The challenge extends beyond religious and cultural dimensions to encompass gender representation more broadly. Machine learning systems trained on historical news corpora inevitably encode the gender biases present in those texts, potentially perpetuating outdated representational norms even as cultural attitudes evolve. The rapid shift in gender representation patterns suggests that static training data may quickly become obsolete, requiring continuous retraining and evaluation to keep algorithmic systems aligned with evolving social norms. Moreover, algorithms trained on historical corpora emphasizing traditional masculine representations may systematically recommend against emotionally expressive content even as audience preferences shift toward such representations, creating an algorithmic conservatism that slows cultural evolution (Eubanks, 2017).
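The algorithmic conservatism described above can be made concrete with a toy simulation. All category names, weights, and drift rates below are hypothetical, invented purely for illustration: a recommender keeps ranking by engagement weights frozen at training time while actual audience preferences drift away from them.

```python
# Toy illustration of algorithmic conservatism: the recommender ranks content
# categories with engagement weights frozen at training time, while audience
# preferences drift away from them. All numbers are hypothetical.

# Weights learned from a historical corpus dominated by traditional content.
trained_weights = {"traditional": 0.85, "emotional": 0.10, "creative": 0.05}

def recommend(weights):
    """Rank categories by the static trained weights."""
    return sorted(weights, key=weights.get, reverse=True)

def drift(prefs, rate=0.15):
    """Shift audience preference away from traditional content each period."""
    moved = min(prefs["traditional"], rate)
    return {
        "traditional": prefs["traditional"] - moved,
        "emotional": prefs["emotional"] + moved / 2,
        "creative": prefs["creative"] + moved / 2,
    }

audience = dict(trained_weights)  # the audience starts where the corpus was
for year in range(2019, 2025):
    audience = drift(audience)
    print(year, "recommended:", recommend(trained_weights)[0],
          "| preferred:", max(audience, key=audience.get))
# Without retraining, the system keeps surfacing "traditional" content long
# after the audience's top preference has shifted.
```

The sketch makes visible why continuous retraining matters: the mismatch between the recommended and preferred columns grows each simulated year even though nothing in the recommender is malfunctioning.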
The opacity of many commercial AI systems deployed in journalism poses additional challenges. The risk of reducing nuance and context, key elements in compelling journalistic storytelling (Calvo-Rubio and Rojas-Torrijos, 2024), becomes particularly problematic when algorithms operate as black boxes. This opacity prevents meaningful evaluation of whether systems reinforce problematic stereotypes, enable diverse expression, or produce unanticipated outcomes diverging from both traditional biases and contemporary values. Gillespie’s (2014) analysis of algorithmic accountability highlights how platform opacity strategically serves commercial interests by preventing competitors from reverse-engineering recommendation systems; yet this same opacity prevents journalists, researchers, and the public from evaluating whether algorithms serve democratic values or perpetuate harmful biases.
The question of human oversight becomes particularly acute regarding gender representation. Only 33% of people think journalists ‘always’ or ‘often’ check AI outputs before publishing (Al-Zoubi et al., 2024), suggesting substantial public skepticism about the adequacy of human review processes. Effective oversight requires not only technical literacy regarding AI function but also critical awareness of gender representation issues and cultural sensitivity, a demanding combination potentially not uniformly distributed across newsroom staff. Developing such oversight capacity requires substantial investment in journalism education addressing algorithmic literacy, gender representation analysis, and cross-cultural communication, an investment that may not be feasible in newsrooms facing economic pressures (Pavlik, 2023).
5 Discussion: research gaps and future directions
Despite growing scholarly attention to digital transformation in journalism, significant gaps remain in understanding how algorithmic systems shape gender representation across cultural contexts. Four primary limitations constrain theoretical development and practical application.
First, most studies examining AI in journalism focus on Western media contexts, with insufficient attention to how algorithmic systems function in non-Western cultural environments. The findings from Indonesian contexts demonstrate that patterns observed in Western settings do not necessarily generalize to other cultural contexts. When religious identity enters the equation, the branding process becomes more nuanced, requiring careful balance between authentic self-expression and market accessibility. This dynamic suggests that cross-cultural research examining how algorithmic systems interact with diverse gender norms and religious identities is essential for developing globally applicable theoretical frameworks. Comparative studies across Muslim-majority nations (Indonesia, Malaysia, Pakistan, Turkey, Saudi Arabia), Asian contexts with different religious compositions (Thailand, Philippines, Japan), African contexts (Nigeria, South Africa, Egypt), and Latin American contexts (Brazil, Mexico, Argentina) would enable identification of which AGRP mechanisms represent universal algorithmic dynamics versus culturally specific patterns (Iwabuchi, 2002; Kraidy, 2005).
Second, longitudinal research tracking how gender representation patterns evolve over extended periods remains scarce. The five-year analysis reveals dramatic shifts within a relatively compressed timeframe, but questions remain about whether these patterns represent permanent transformations or transitional states. Future research employing longer time horizons could illuminate whether observed diversification of gender representation stabilizes, continues evolving, or potentially reverses under changing technological, economic, or cultural conditions. Additionally, cohort-based studies could clarify whether strong preferences for diverse gender expression among younger audiences represent generational differences persisting as cohorts age or life-stage effects diminishing over time. Longitudinal research could also examine whether algorithmic acceleration of cultural change creates backlash dynamics where rapid transformation provokes conservative reactions seeking to restore traditional norms and patterns, potentially visible in some Indonesian political discourse around gender and morality (Brenner, 2011).
Third, methodological approaches for examining algorithmic influence on gender representation remain underdeveloped. While valuable, traditional content analysis methods struggle to capture the dynamic, feedback-driven nature of algorithmic content curation. The interaction between content characteristics, engagement metrics, algorithmic recommendations, and subsequent content production creates complex causal patterns resisting straightforward analysis. Understanding how these predictive systems shape content creation requires methodologies tracing feedback loops across multiple temporal scales and platforms. Needed methodological innovations include computational approaches enabling large-scale analysis of recommendation patterns, experimental designs manipulating algorithmic parameters to identify causal effects, ethnographic research examining how journalists respond to algorithmic feedback, and mixed-methods approaches integrating quantitative pattern identification with qualitative understanding of meaning-making processes (Vicari and Kirby, 2022).
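As a minimal sketch of why such feedback loops resist static content analysis, the following toy model lets producers shift their output toward whatever the engagement-amplifying recommender rewarded in the previous period. The categories, engagement rates, and adjustment rate are all hypothetical and are not derived from any real platform.

```python
# Minimal sketch of the algorithm-audience-producer feedback loop described
# above. All categories, engagement rates, and the adjustment rate are
# hypothetical; the point is only that supply chases amplified engagement.

CATS = ["traditional", "emotional", "creative"]

# Share of content produced per category, and per-item engagement rates.
production = {"traditional": 0.70, "emotional": 0.15, "creative": 0.15}
engagement_rate = {"traditional": 0.03, "emotional": 0.06, "creative": 0.05}

def step(production, lr=0.3):
    """One period: engagement is earned, the recommender amplifies what
    earned it, and producers move a fraction `lr` of output toward that."""
    earned = {c: production[c] * engagement_rate[c] for c in CATS}
    total = sum(earned.values())
    target = {c: earned[c] / total for c in CATS}
    return {c: (1 - lr) * production[c] + lr * target[c] for c in CATS}

for period in range(10):
    production = step(production)
print({c: round(v, 2) for c, v in production.items()})
# Production shares drift toward higher-engagement categories, so a content
# analysis of any single snapshot cannot reveal the loop that produced it.
```

Even this stripped-down replicator-style dynamic shows why longitudinal tracing is needed: the distribution observed at any one moment is a moving point in an ongoing loop, not a stable property of the media system.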
Fourth, existing research has insufficiently examined how platform business models and regulatory contexts shape algorithmic gender representation. While technical analyses of algorithms proliferate, political economic analysis examining how commercial imperatives and regulatory constraints influence what algorithms optimize for remains underdeveloped. Future research should systematically compare gender representation patterns across platforms with different business models (advertiser-funded vs. subscription-based vs. state-supported) and regulatory environments (light-touch vs. comprehensive content regulation) to clarify how economic and institutional structures shape algorithmic outcomes (Srnicek, 2017; van Dijck et al., 2018).
Several specific research directions emerge as particularly promising. Comparative studies examining how similar content performs across different algorithmic environments could isolate platform-specific effects from content-inherent characteristics. Participatory research approaches involving journalists, content creators, and platform users could provide crucial insights into subjective experiences of algorithmic gender representation. While quantitative engagement metrics reveal important patterns, they cannot capture how individuals interpret and respond to gender representations encountered in algorithmically curated news feeds. Such participatory approaches could employ cultural probes, diary studies, and co-design workshops, enabling diverse stakeholders to articulate experiences that quantitative metrics miss (Sanders and Stappers, 2008).
Experimental studies manipulating specific algorithmic parameters could provide causal evidence regarding how different recommendation strategies affect gender representation outcomes. While ethical and practical constraints limit feasibility in operational news environments, simulated environments or platform partnerships could yield valuable insights. Platform partnerships enabling A/B testing of different algorithmic approaches, for instance, comparing engagement-optimized versus diversity-weighted recommendation systems, could provide causal evidence currently lacking in observational research. However, such partnerships raise questions about researcher independence and commercial influence (Zuboff, 2019). Interdisciplinary research integrating perspectives from communication studies, computer science, gender studies, and area studies could develop more comprehensive theoretical frameworks addressing the inherently multidimensional nature of these phenomena.
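One way such an A/B comparison could be operationalized, offered purely as an illustrative sketch with invented item names and engagement scores, is to rank the same candidate pool by raw predicted engagement versus a greedily diversity-weighted score:

```python
# Hedged sketch of the A/B comparison described above: the same candidate
# items ranked by (a) pure predicted engagement and (b) an engagement score
# penalized for over-representing any one gender-representation category.
# Items, categories, and scores are invented for illustration.
from collections import Counter

items = [
    ("clip_a", "traditional", 0.90),
    ("clip_b", "traditional", 0.88),
    ("clip_c", "traditional", 0.86),
    ("clip_d", "emotional",   0.84),
    ("clip_e", "creative",    0.80),
]

def rank_engagement(items, k=3):
    """Top-k by predicted engagement alone."""
    return [name for name, _, _ in sorted(items, key=lambda x: -x[2])][:k]

def rank_diversity(items, k=3, penalty=0.1):
    """Greedy re-ranking: each pick pays a penalty per already-picked
    item from the same category (a simple diversity weighting)."""
    picked, counts, pool = [], Counter(), list(items)
    for _ in range(k):
        best = max(pool, key=lambda x: x[2] - penalty * counts[x[1]])
        picked.append(best[0])
        counts[best[1]] += 1
        pool.remove(best)
    return picked

print(rank_engagement(items))  # all three slots go to traditional content
print(rank_diversity(items))   # same budget, one slot per category
```

Comparing engagement and representational diversity across the two arms of such a test would supply the causal evidence the observational literature lacks, though, as noted above, the required platform access raises independence concerns.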
Research examining how journalists and newsroom organizations understand and respond to algorithmic influences on gender representation could inform professional practice and education. The success of digital transformation depends on developing holistic approaches that consider technological, ethical, economic, and social aspects (Sonni, 2025). Studies examining how journalism education addresses these challenges could identify gaps in current training and inform curriculum development. Journalism education research could determine whether curricula adequately address algorithmic literacy, gender representation analysis, platform political economy, and cross-cultural communication and how to integrate these topics given already-crowded curricula and limited resources (Pavlik, 2023).
Policy-relevant research examining how regulatory frameworks might address problematic aspects of algorithmic gender representation while preserving beneficial dimensions merits attention. The rapid pace of technological change often outpaces regulatory adaptation, creating potential for harmful patterns to become entrenched before governance mechanisms emerge. Research clarifying the scope and nature of algorithmic influence on gender representation could inform evidence-based policy development, balancing algorithmic bias concerns with media freedom and journalistic independence principles. Such policy research should examine diverse regulatory approaches across national contexts: European Union content regulation frameworks, Indonesian broadcasting oversight models, and American light-touch approaches, assessing their relative effectiveness at promoting inclusive representation while preserving press freedom and platform innovation (Gillespie, 2018).
The theoretical and practical implications extend beyond individual entrepreneur strategies to encompass broader questions about digital inclusion, algorithmic fairness, and cultural representation in contemporary business and media environments. For journalism practice, findings suggest the necessity of developing platform-specific content strategies, implementing robust AI system oversight, and cultivating technical and critical literacies regarding algorithmic influence. A successful media organization will create synergies between human and artificial intelligence while upholding its role as a guardian of democracy and healthy public discourse. This requires institutional investments in algorithmic literacy training, development of ethical guidelines for AI usage in newsrooms, creation of editorial workflows incorporating algorithmic awareness, and cultivation of newsroom cultures valuing both technological innovation and journalistic integrity (Diakopoulos, 2019).
Understanding algorithmic influence on gender representation requires interdisciplinary collaboration and methodological innovation. The evidence suggests these systems possess substantial influence, yet their specific nature and societal implications remain incompletely understood. Continued research, critical attention, and thoughtful governance are essential for ensuring that algorithmic transformation serves democratic values and promotes fair, accurate, and culturally sensitive representation across diverse global contexts. Moving forward, scholarship should resist both technological determinism (algorithms simply imposing effects on passive cultures) and technological solutionism (better algorithms will automatically solve representation problems), instead examining the complex co-constitution of technological systems, economic structures, cultural values, and human agency in producing gender representation outcomes (Plantin et al., 2016; Couldry and Mejias, 2019; Figure 1).
Figure 1. Platform-Specific Gender Representation Patterns and Temporal Evolution in Indonesian Digital Journalism. (A) Distribution of gender representation types across platforms in 2024, showing differential patterns driven by platform affordances and business models. (B) Five-year temporal analysis demonstrating a rapid shift from traditional (85 to 30%) to diverse representations. Data derived from systematic analysis of 240 h of television programming and 1,100 digital media items across the Indonesian media ecosystem. Traditional = competitive achievement/leadership focus; Emotional = interpersonal connection/vulnerability; Creative = innovative expression/alternative lifestyles. Platform-specific variations reflect the interaction of algorithmic recommendation systems, user demographics, business model optimization targets (engagement vs. regulatory compliance), and cultural appropriation patterns, illustrating AGRP’s core theoretical claims about the co-constitution of technology, economy, and culture in shaping gender representation.
6 Concluding remarks
The digital transformation of journalism extends beyond technological change to encompass fundamental alterations in how media systems construct and transform gender norms. The proposed Algorithmic Gender Representation Paradigm offers a theoretical framework synthesizing platform studies, feminist technology scholarship, glocalization theory, and political economy analysis to emphasize co-constitutive relationships between technical systems, cultural values, and human agency. This framework moves beyond simplistic technological determinism or social constructivism to recognize that algorithmic systems exert independent influence on representational outcomes while remaining embedded in broader cultural and economic contexts.
The documented rapid shift in gender representation patterns, with traditional masculine representations declining 55 percentage points over 5 years, suggests that algorithmic systems may accelerate cultural transformation at unprecedented rates. This finding carries profound implications for understanding the media’s role in social change, indicating that the transition to algorithmically curated environments may fundamentally alter the temporal dynamics of cultural transformation. However, whether this acceleration represents permanent transformation, transitional instability, or localized Indonesian patterns remains an open empirical question requiring longitudinal and comparative research.
Critical to understanding these patterns is recognition that algorithmic effects operate through interaction with platform business models and regulatory contexts. The divergence between television’s predominantly traditional representations (65%) and TikTok’s emphasis on emotional (42%) and creative (45%) expressions reflects not only technical affordances but also economic logics: advertiser-funded platforms optimizing for engagement versus broadcast media navigating regulatory oversight. This political economic dimension has been undertheorized in existing scholarship but proves essential for explaining observed patterns (Srnicek, 2017; van Dijck et al., 2018).
However, significant research gaps remain. Current literature lacks sufficient cross-cultural research, longitudinal studies tracking long-term patterns, political-economic analysis of business model influences, and methodologies adequate for examining complex feedback loops between algorithms, audiences, and content creators. Addressing these gaps requires interdisciplinary collaboration integrating communication studies, computer science, gender studies, political economy, and area studies. Comparative research across diverse national contexts, regulatory environments, and cultural settings is essential for determining which aspects of AGRP represent universal dynamics versus culturally or institutionally specific patterns. Such research should resist Western-centric assumptions while avoiding relativism that would deny any cross-cultural patterns, instead examining how global platform architectures interact with local contexts to produce hybrid outcomes (Kraidy, 2005; Iwabuchi, 2002).
The framework presented here emerges from Indonesian empirical observations. It proposes mechanisms for understanding algorithmic gender representation in contexts where global platforms intersect with non-Western cultural values and diverse media institutional structures. We position AGRP as a theoretical contribution grounded in substantial empirical evidence while acknowledging that claiming universal applicability requires systematic validation across diverse contexts. The Indonesian case offers valuable theoretical insights given the nation’s scale, digital economy development, and position at the intersection of Islamic values and technological innovation. However, these same characteristics mean findings may not directly translate to other contexts without adaptation.
For journalism practice, the findings suggest the need for developing platform-specific content strategies, understanding how different business models shape algorithmic behavior, implementing robust AI system oversight, and cultivating both critical and technical literacies regarding algorithmic influence. As journalism continues its digital transformation, understanding how algorithmic systems, platform business models, and regulatory contexts shape gender representation will only grow in significance. Continued research, critical engagement with platform companies, and thoughtful governance are essential for ensuring that algorithmic transformation serves democratic values and promotes fair, accurate, and culturally sensitive representation across diverse global contexts.
Ultimately, this perspective argues that algorithmic gender representation should be understood neither as technological determinism (platforms simply imposing effects) nor as cultural autonomy (local cultures freely appropriating technologies) but rather as ongoing negotiation between global platform logics, local cultural contexts, business model imperatives, and regulatory frameworks. This negotiation produces hybrid outcomes that require theoretically sophisticated, empirically grounded, and culturally sensitive analysis. The AGRP framework offers analytical tools for such analysis while acknowledging substantial work remains to validate, refine, and potentially adapt these tools across the diverse contexts of global digital journalism.
Data availability statement
The original contributions presented in the study are included in the article/supplementary material; further inquiries can be directed to the corresponding author.
Author contributions
AFS: Formal analysis, Writing – original draft, Methodology, Software, Data curation, Writing – review & editing, Conceptualization.
Funding
The author(s) declared that financial support was not received for this work and/or its publication.
Acknowledgments
The author thanks colleagues at the Department of Communication Studies, Hasanuddin University, for valuable discussions contributing to this perspective.
Conflict of interest
The author(s) declared that this work was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.
Generative AI statement
The author(s) declared that Generative AI was not used in the creation of this manuscript.
Any alternative text (alt text) provided alongside figures in this article has been generated by Frontiers with the support of artificial intelligence and reasonable efforts have been made to ensure accuracy, including review by the authors wherever possible. If you identify any issues, please contact us.
Publisher’s note
All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.
References
Abidin, C. (2021). Mapping internet celebrity on TikTok: exploring attention economies and visibility labours. Cult. Sci. J. 12, 77–103. doi: 10.5334/csci.140
Al-Zoubi, O., Ahmad, N., and Hamid, N. A. (2024). Artificial intelligence in newsrooms: ethical challenges facing journalists. Stud. Media Communication 12:401. doi: 10.11114/smc.v12i1.6587
Amponsah, P. N., and Atianashie, A. M. (2024). Navigating the new frontier: a comprehensive review of AI in journalism. Advan. Journalism Communication 12, 1–17. doi: 10.4236/ajc.2024.121001
Benjamin, R. (2020). Race after technology: Abolitionist tools for the new Jim code. Cambridge, UK; Medford, MA: Polity.
Brenner, S. (2011). Private moralities in the public sphere: democratization, Islam, and gender in Indonesia. Am. Anthropol. 113, 478–490. doi: 10.1111/j.1548-1433.2010.01355.x
Burgess, J., and Green, J. (2018). YouTube: Online video and participatory culture. Digital Media and Society Series. 2nd Edn. Cambridge, UK; Medford, MA: Polity Press.
Calvo-Rubio, L.-M., and Rojas-Torrijos, J.-L. (2024). Criteria for journalistic quality in the use of artificial intelligence. Commun. Soc. 37, 247–259. doi: 10.15581/003.37.2.247-259
Couldry, N., and Mejias, U. A. (2019). The costs of connection: How data is colonizing human life and appropriating it for capitalism. Culture and Economic Life Series. Stanford, CA: Stanford University Press.
de-Lima-Santos, M.-F., and Ceron, W. (2021). Artificial intelligence in news media: current perceptions and future outlook. J. Journalism Media 3, 13–26. doi: 10.3390/journalmedia3010002
Diakopoulos, N. (2019). Automating the news: How algorithms are rewriting the media. London: Harvard University Press.
Duffy, B. E., and Hund, E. (2015). “Having it all” on social media: entrepreneurial femininity and self-branding among fashion bloggers. Soc. Media Soc. 1, 1–11. doi: 10.1177/2056305115604337
Eubanks, V. (2017). Automating inequality: How high-tech tools profile, police, and punish the poor. First Edn. New York, NY: St. Martin's Press.
Gillespie, T. (2010). The politics of ‘platforms’. New Media Soc. 12, 347–364. doi: 10.1177/1461444809342738
Gillespie, T. (2014). “The relevance of algorithms” in Media technologies: Essays on communication, materiality, and society. eds. T. Gillespie, P. J. Boczkowski, and K. A. Foot (The MIT Press).
Gillespie, T. (2018). Custodians of the internet: Platforms, content moderation, and the hidden decisions that shape social media. New Haven: Yale University Press.
Husnain, M., Imran, A., and Tareen, H. K. (2024). Artificial intelligence in journalism: examining prospectus and obstacles for students in the domain of media. J. Asian Dev. Stud. 13, 614–625. doi: 10.62345/jads.2024.13.1.51
Iwabuchi, K. (2002). Recentering globalization: Popular culture and Japanese transnationalism. Durham: Duke University Press.
Kraidy, M. M. (2005). Hybridity, or the cultural logic of globalization. Philadelphia: Temple University Press.
Lim, M. (2012). The league of thirteen: Media concentration in Indonesia. Tempe, AZ: Participatory Media Lab, Arizona State University, and The Ford Foundation.
Lotz, A. D. (2017). Portals: A treatise on internet-distributed television. Ann Arbor, Mich: Maize Books, an imprint of Michigan Publishing.
Munir, B. (2025). Islamophobic artificial intelligence in the USA: a critical analysis of religious bias in datasets. Law Libr. J., 1–38. doi: 10.2139/ssrn.5265355
Newman, N. (2024). Journalism, media, and technology trends and predictions 2024. Oxford: Reuters Institute for the Study of Journalism.
Noble, S. U. (2018). Algorithms of oppression: how search engines reinforce racism. New York: NYU Press.
Pavlik, J. V. (2023). Collaborating with ChatGPT: considering the implications of generative artificial intelligence for journalism and media education. Journalism Mass Commun. Educ. 78, 84–93. doi: 10.1177/10776958221149577
Plantin, J.-C., Lagoze, C., Edwards, P. N., and Sandvig, C. (2016). Infrastructure studies meet platform studies in the age of Google and Facebook. New Media Soc. 20, 293–310. doi: 10.1177/1461444816661553
Putri, V. C. C., and Sonni, A. F. (2023). Ketika cadar jadi alat persuasi: strategi beauty influencer di era digital. Makassar: Unhas Press.
Robertson, R. (1995). “Glocalization: time-space and homogeneity-heterogeneity” in Global Modernities. eds. M. Featherstone, S. Lash, and R. Robertson (London: SAGE), 25–44.
Sanders, E. B. N., and Stappers, P. J. (2008). Co-creation and the new landscapes of design. CoDesign 4, 5–18. doi: 10.1080/15710880701875068
Sonni, A. F. (2025). Digital transformation in journalism: mini review on the impact of AI on journalistic practices. Front. Commun. 10, 1–4. doi: 10.3389/fcomm.2025.1535156
Sonni, A. F., Putri, V. C. C., Akbar, M., and Irwanto, I. (2025). Platform-specific masculinities: the evolution of gender representation in Indonesian reality shows across television and digital media. J. Med. 6, 1–17. doi: 10.3390/journalmedia6010038
Srnicek, N. (2017). Platform capitalism. Cambridge, UK; Malden, MA: Polity Press.
van Dijck, J., Poell, T., and de Waal, M. (2018). The platform society: Public values in a connective world. Oxford: Oxford University Press.
Vicari, S., and Kirby, D. (2022). Digital platforms as socio-cultural artifacts: developing digital methods for cultural research. Inf. Commun. Soc. 26, 1733–1755. doi: 10.1080/1369118x.2022.2027498
Wajcman, J. (2009). Feminist theories of technology. Camb. J. Econ. 34, 143–152. doi: 10.1093/cje/ben057
Zuboff, S. (2019). The age of surveillance capitalism: The fight for a human future at the new frontier of power. New York: PublicAffairs.
Keywords: digital journalism, algorithmic representation, gender studies, platform studies, masculinity, artificial intelligence, Indonesian media, cultural transformation
Citation: Sonni AF (2025) Algorithmic gender representation in digital journalism: a perspective on platform-mediated masculinities in Indonesian media. Front. Hum. Dyn. 7:1735924. doi: 10.3389/fhumd.2025.1735924
Edited by:
Phivos Mylonas, University of West Attica, Greece
Reviewed by:
Zoltán Rajki, Pázmány Péter Catholic University, Hungary
Copyright © 2025 Sonni. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
*Correspondence: Alem Febri Sonni, alemfebris@unhas.ac.id