#sociotechnical
magictavern ¡ 5 months ago
Text
going to pound my head against a desk repeatedly
9 notes ¡ View notes
transmutationisms ¡ 4 months ago
Note
Given your last answer, do you think we live under censorship and is that even a useful term? We don't have free speech obviously
correct but most people wildly misunderstand how censorship usually functions because they think of it solely as violating a pure negative right to self-expression (freedom-from direct government intervention) and not as violating a positive right to self-expression without economic consequence (freedom-to exist in a home with enough to eat etc). most censorship is mediated by the employer–employee relationship (or, for children / elderly / disabled people, what is really equivalent, the parent–child or caretaker–dependant relationship). it's comparatively way more likely you will face the choice of "stay silent or lose your access to financial solvency" than the direct threat of prison or whatever -- the latter of course does happen, but the former is just the routine daily fabric of existing in a dotb. i don't think it's useful to call this anything other than censorship because i think censorship has always been enacted via the economic structures of the society it's taking place in. it's not some ideological overlay -- it happens at the behest of, and using the sociotechnical infrastructure of, the actual material state, which is to say the economic system of bourgeois rule
63 notes ¡ View notes
quasi-normalcy ¡ 1 year ago
Text
So here's something that someone said to me a few days ago which seems kind of obvious once you've heard it, but which merits mention nonetheless: In the past, before the world wars, inventions were made primarily by individual inventors or small groups looking to apply them to actual human concerns; now, new technologies come about largely through industrial R&D divisions and are made specifically to boost shareholder value. The present seems bewildering and the future unknowable and kind of horrifying because huge sociotechnical transformations like AI and IoT are being driven not by actual human concerns or desires, but purely by the interests of capital. Human agency is no longer in the driver's seat, and we are going somewhere that most of us very much do not want to go. It's the difference between the late-Victorian inventor-hero and cyberpunk.
Anyways, we need to seize the means of invention.
320 notes ¡ View notes
probablyasocialecologist ¡ 1 year ago
Text
As historian Paul Edwards argues, climate science and meteorology were the first fields that collected and processed global data in near real time. To make even a banal statement like ‘The world has warmed by 1°C since 1850’, scientists need to compile data from a vast global network of ground observation sites, weather balloons, research vessels, and satellites, then feed that data into enormous physical and mathematical models to form a coherent world picture. Edwards calls climate science a ‘vast machine’ in that it is ‘a sociotechnical system that collects data, models physical processes, tests theories, and ultimately generates a widely shared understanding of climate and climate change.’
Troy Vettese, Drew Pendergrass, Half-Earth Socialism: A Plan to Save the Future from Extinction, Climate Change and Pandemics
224 notes ¡ View notes
mariacallous ¡ 10 months ago
Text
At the 2023 Defcon hacker conference in Las Vegas, prominent AI tech companies partnered with algorithmic integrity and transparency groups to sic thousands of attendees on generative AI platforms and find weaknesses in these critical systems. This “red-teaming” exercise, which also had support from the US government, took a step in opening these increasingly influential yet opaque systems to scrutiny. Now, the ethical AI and algorithmic assessment nonprofit Humane Intelligence is taking this model one step further. On Wednesday, the group announced a call for participation with the US National Institute of Standards and Technology, inviting any US resident to participate in the qualifying round of a nationwide red-teaming effort to evaluate AI office productivity software.
The qualifier will take place online and is open to both developers and anyone in the general public as part of NIST's AI challenges, known as Assessing Risks and Impacts of AI, or ARIA. Participants who pass through the qualifying round will take part in an in-person red-teaming event at the end of October at the Conference on Applied Machine Learning in Information Security (CAMLIS) in Virginia. The goal is to expand capabilities for conducting rigorous testing of the security, resilience, and ethics of generative AI technologies.
“The average person utilizing one of these models doesn’t really have the ability to determine whether or not the model is fit for purpose,” says Theo Skeadas, chief of staff at Humane Intelligence. “So we want to democratize the ability to conduct evaluations and make sure everyone using these models can assess for themselves whether or not the model is meeting their needs.”
The final event at CAMLIS will split the participants into a red team trying to attack the AI systems and a blue team working on defense. Participants will use the AI 600-1 profile, part of NIST's AI risk management framework, as a rubric for measuring whether the red team is able to produce outcomes that violate the systems' expected behavior.
“NIST's ARIA is drawing on structured user feedback to understand real-world applications of AI models,” says Humane Intelligence founder Rumman Chowdhury, who is also a contractor in NIST's Office of Emerging Technologies and a member of the US Department of Homeland Security AI safety and security board. “The ARIA team is mostly experts on sociotechnical test and evaluation, and [is] using that background as a way of evolving the field toward rigorous scientific evaluation of generative AI.”
Chowdhury and Skeadas say the NIST partnership is just one of a series of AI red team collaborations that Humane Intelligence will announce in the coming weeks with US government agencies, international governments, and NGOs. The effort aims to make it much more common for the companies and organizations that develop what are now black-box algorithms to offer transparency and accountability through mechanisms like “bias bounty challenges,” where individuals can be rewarded for finding problems and inequities in AI models.
“The community should be broader than programmers,” Skeadas says. “Policymakers, journalists, civil society, and nontechnical people should all be involved in the process of testing and evaluating of these systems. And we need to make sure that less represented groups like individuals who speak minority languages or are from nonmajority cultures and perspectives are able to participate in this process.”
81 notes ¡ View notes
nitrosplicer ¡ 1 year ago
Text
“On the other hand, my heartbeat speeds up slightly as I near the end of the line, because I know that I’m almost certainly about to experience an embarrassing, uncomfortable, and perhaps humiliating search by a Transportation Security Administration (TSA) officer, after my body is flagged as anomalous by the millimeter wave scanner. I know that this is almost certainly about to happen because of the particular sociotechnical configuration of gender normativity (cis-normativity, or the assumption that all people have a gender identity that is consistent with the sex they were assigned at birth) that has been built into the scanner, through the combination of user interface (UI) design, scanning technology, binary-gendered body-shape data constructs, and risk detection algorithms, as well as the socialization, training, and experience of the TSA agents.
[…]
I glance to the left, where a screen displays an abstracted outline of a human body. As I expected, bright fluorescent yellow pixels on the flat-panel display highlight my groin area (see figure 0.1). You see, when I entered the scanner, the TSA operator on the other side was prompted by the UI to select Male or Female; the button for Male is blue, the button for Female is pink. Since my gender presentation is nonbinary femme, usually the operator selects Female. However, the three-dimensional contours of my body, at millimeter resolution, differ from the statistical norm of female bodies as understood by the data set and risk algorithm designed by the manufacturer of the millimeter wave scanner (and its subcontractors), and as trained by a small army of clickworkers tasked with labeling and classification (as scholars Lilly Irani, Nick Dyer-Witheford, Mary Gray, and Siddharth Suri, among others, remind us). If the agent selects Male, my breasts are large enough, statistically speaking, in comparison to the normative male body-shape construct in the database, to trigger an anomaly warning and a highlight around my chest area. If they select Female, my groin area deviates enough from the statistical female norm to trigger the risk alert. In other words, I can’t win. This sociotechnical system is sure to mark me as “risky,” and that will trigger an escalation to the next level in the TSA security protocol.”
- Introduction: #TravelingWhileTrans, Design Justice, and Escape from the Matrix of Domination by Sasha Costanza-Chock
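The "I can't win" logic Costanza-Chock describes can be restated as a toy sketch: a system that measures every body against one of two binary templates and flags any region that deviates past a threshold will flag a body that matches neither template, whichever template is selected. Every norm, tolerance, and measurement below is invented purely for illustration; this is not the scanner vendor's actual model, algorithm, or data.

```python
# Toy illustration only: invented per-region "norms" (mean, tolerance) for two
# binary body templates, and a simple z-score cutoff standing in for the
# scanner's proprietary risk algorithm.
NORMS = {
    "male":   {"chest": (85.0, 6.0),  "groin": (50.0, 4.0)},
    "female": {"chest": (100.0, 6.0), "groin": (40.0, 4.0)},
}

def flagged_regions(measurements, template, z_cutoff=2.0):
    """Return regions whose deviation from the selected binary template exceeds the cutoff."""
    flags = []
    for region, value in measurements.items():
        mean, tolerance = NORMS[template][region]
        if abs(value - mean) / tolerance > z_cutoff:
            flags.append(region)
    return flags

# A body that fits neither template cleanly (invented measurements):
body = {"chest": 99.0, "groin": 49.0}
print(flagged_regions(body, "female"))  # ['groin'] -> anomaly alert
print(flagged_regions(body, "male"))    # ['chest'] -> anomaly alert
```

Whichever template the operator selects, some region exceeds the cutoff, which is exactly the structural point of the excerpt.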
8 notes ¡ View notes
azspot ¡ 1 year ago
Quote
By all means, go after big tech. Regulate advertising. Create data privacy laws. Hold tech accountable for its failure to be interoperable. But for the love of the next generation, don’t pretend that it’s going to help vulnerable youth. And when the problem is sociotechnical in nature, don’t expect corporations to be able to solve it.
danah boyd
10 notes ¡ View notes
protoslacker ¡ 1 year ago
Text
Of course, it’s not just sociotechnical systems that are degrading. So too is our collective social fabric. And, with it, the mental health of young people. Last month, Crisis Text Line published some of its latest data about depression and suicide alongside what CTL is hearing from young people about what they need to thrive. (Hint: banning technology is not their top priority.) Young people are literally dying due to a lack of opportunities for social connection. This should break your heart. Teens are feeling isolated and alone. (My research consistently showed that this is why they turn to technology in the first place.) It’s also scary to see the lack of access to community resources. Communities are degrading. And there’s no quick technical fix. 
danah boyd at apophenia. Degradation, Legitimacy Threats, and Isolation
New research on census, youth, mental health; a recent talk and an upcoming one
The sentence "This should break your heart," said when talking about young people, is so important to me. Something so central to being human is too often utterly neglected.
The Fireside Chat in which Tressie McMillan Cottom, Janet Vertesi, and danah boyd discuss tech & society issues is very good.
3 notes ¡ View notes
raffaellopalandri ¡ 14 days ago
Text
Reclaiming Slowness: The Temporal Politics of Contemplation in an Accelerated World
We live in a world governed not merely by the rapidity of movement but by the tyranny of acceleration itself. Modern temporality, shaped by capitalist production, algorithmic modulation, and infrastructural compression, no longer unfolds—it is extracted, quantified, and weaponised. We are immersed in a sociotechnical assemblage that confuses speed with…
[Photo by Song Kaiyue on Pexels.com]
1 note ¡ View note
geeknik ¡ 20 days ago
Text
Antigua’s AI plan: sunshine laws but digital sieve. Hackers’ playground, ethics theater. #OpenToInvasion
0 notes
strategictech ¡ 1 month ago
Text
A Systematic Literature Review on System Dynamics
System Dynamics is a methodology for modeling and simulating dynamically complex systems, which are characterized by feedback loops, nonlinearity, and time delays. It uses Causal Loop Diagrams and Stock-and-Flow Diagrams to represent system structures and to analyze how these structures influence system behavior over time. The majority of reviews on System Dynamics use cases either focus on a single thematic domain or are not up to date. As a result, a comprehensive and up-to-date overview of System Dynamics use cases across all thematic domains is lacking. Furthermore, no Systematic Literature Review has yet examined whether and to what extent artificial intelligence has been employed to extend and enhance the System Dynamics approach in identified use cases. To address this research gap, we conducted a Systematic Literature Review. The identified studies are classified within a sociotechnical system framework consisting of User-Machine Interaction, Value Chain-related Domains, and Social Factors. Additionally, we examine the role of artificial intelligence in advancing the System Dynamics approach within the identified use cases. We identified 41 relevant publications covering 19 thematic domains. Most studies fall within the Value Chain-related Domains system, followed by Social Factors and User-Machine Interaction. The integration of artificial intelligence into System Dynamics use cases remains rare. Our findings highlight the broad applicability of System Dynamics across various thematic domains while simultaneously revealing a gap in the complementary use of artificial intelligence.
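To make the Stock-and-Flow vocabulary concrete, here is a minimal sketch of the kind of model the review surveys: one stock, a balancing feedback loop, and a time delay, integrated with a simple Euler step. All names and parameter values are illustrative and are not drawn from any of the reviewed studies.

```python
# A minimal stock-and-flow sketch in the System Dynamics style: one stock
# (inventory), a balancing feedback loop (ordering toward a target), and a
# supply-line time delay. All names and parameter values are illustrative.

DT = 0.25          # integration step, in weeks
STEPS = 160        # 40 simulated weeks
TARGET = 100.0     # desired inventory level
ADJ_TIME = 2.0     # weeks taken to close the inventory gap
DELAY = 4.0        # average supply-line delay, in weeks
DEMAND = 10.0      # constant outflow, units per week

inventory, supply_line = 60.0, 0.0
for step in range(STEPS):
    orders = max(0.0, (TARGET - inventory) / ADJ_TIME)  # balancing loop
    arrivals = supply_line / DELAY                      # first-order delay
    supply_line += (orders - arrivals) * DT             # stock update
    inventory += (arrivals - DEMAND) * DT               # stock update
    if step % 20 == 0:                                  # print every ~5 weeks
        print(f"week {step * DT:5.1f}  inventory {inventory:6.1f}")
```

The combination of the delayed supply line and the balancing loop is what produces the overshoot-and-oscillation behaviour that Causal Loop Diagrams and Stock-and-Flow Diagrams are meant to expose.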
@tonyshan #techinnovation https://bit.ly/tonyshan https://bit.ly/tonyshan_X
0 notes
xaltius ¡ 2 months ago
Text
The top Data Engineering trends to look for in 2025
Data engineering is the unsung hero of our data-driven world. It's the critical discipline that builds and maintains the robust infrastructure enabling organizations to collect, store, process, and analyze vast amounts of data. As we navigate mid-2025, this foundational field is evolving at an unprecedented pace, driven by the exponential growth of data, the insatiable demand for real-time insights, and the transformative power of AI.
Staying ahead of these shifts is no longer optional; it's essential for data engineers and the organizations they support. Let's dive into the key data engineering trends that are defining the landscape in 2025.
1. The Dominance of the Data Lakehouse
What it is: The data lakehouse architecture continues its strong upward trajectory, aiming to unify the best features of data lakes (flexible, low-cost storage for raw, diverse data types) and data warehouses (structured data management, ACID transactions, and robust governance).
Why it's significant: It offers a single platform for various analytics workloads, from BI and reporting to AI and machine learning, reducing data silos, complexity, and redundancy. Open table formats like Apache Iceberg, Delta Lake, and Hudi are pivotal in enabling lakehouse capabilities.
Impact: Greater data accessibility, improved data quality and reliability for analytics, simplified data architecture, and cost efficiencies.
Key Technologies: Databricks, Snowflake, Amazon S3, Azure Data Lake Storage, Apache Spark, and open table formats.
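As a rough illustration of the lakehouse pattern, here is a hedged PySpark sketch using the Delta Lake open table format. It assumes a Spark installation with the delta-spark package available; the table path, schema, and rows are invented for the example.

```python
# A rough lakehouse-style sketch with PySpark + Delta Lake. Assumes the
# delta-spark package is installed; the path, schema, and rows are invented.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder.appName("lakehouse-sketch")
    # standard Delta Lake session configs (require delta-spark on the classpath)
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

events = spark.createDataFrame(
    [(1, "click", "2025-05-01"), (2, "view", "2025-05-01")],
    ["user_id", "event_type", "event_date"],
)

# ACID append to an open table format sitting on cheap file/object storage
events.write.format("delta").mode("append").save("/tmp/lakehouse/events")

# The same table then serves BI-style aggregation and ML feature pipelines
(spark.read.format("delta")
      .load("/tmp/lakehouse/events")
      .groupBy("event_type").count()
      .show())
```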
2. AI-Powered Data Engineering (Including Generative AI)
What it is: Artificial intelligence, and increasingly Generative AI, are becoming integral to data engineering itself. This involves using AI/ML to automate and optimize various data engineering tasks.
Why it's significant: AI can significantly boost efficiency, reduce manual effort, improve data quality, and even help generate code for data pipelines or transformations.
Impact:
* Automated Data Integration & Transformation: AI tools can now automate aspects of data mapping, cleansing, and pipeline optimization.
* Intelligent Data Quality & Anomaly Detection: ML algorithms can proactively identify and flag data quality issues or anomalies in pipelines.
* Optimized Pipeline Performance: AI can help in tuning and optimizing the performance of data workflows.
* Generative AI for Code & Documentation: LLMs are being used to assist in writing SQL queries, Python scripts for ETL, and auto-generating documentation.
Key Technologies: AI-driven ETL/ELT tools, MLOps frameworks integrated with DataOps, platforms with built-in AI capabilities (e.g., Databricks AI Functions, AWS DMS with GenAI).
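One concrete flavour of "intelligent data quality" can be sketched with an off-the-shelf unsupervised model. The metric columns, values, and contamination rate below are illustrative assumptions, not a recommendation of any particular tool.

```python
# Flag anomalous pipeline batches with an unsupervised model; the feature
# columns, values, and contamination rate here are illustrative assumptions.
import pandas as pd
from sklearn.ensemble import IsolationForest

batch = pd.DataFrame({
    "row_count":    [10_020, 9_980, 10_050, 10_010, 1_200],  # last batch looks off
    "null_ratio":   [0.01, 0.02, 0.01, 0.01, 0.35],
    "load_seconds": [42, 40, 45, 41, 300],
})

model = IsolationForest(contamination=0.2, random_state=0)
batch["anomaly"] = model.fit_predict(batch)   # -1 = anomalous, 1 = normal

print(batch[batch["anomaly"] == -1])          # surface suspect batches for review
```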
3. Real-Time Data Processing & Streaming Analytics as the Norm
What it is: The demand for immediate insights and actions based on live data streams continues to grow. Batch processing is no longer sufficient for many use cases.
Why it's significant: Businesses across industries like e-commerce, finance, IoT, and logistics require real-time capabilities for fraud detection, personalized recommendations, operational monitoring, and instant decision-making.
Impact: A shift towards streaming architectures, event-driven data pipelines, and tools that can handle high-throughput, low-latency data.
Key Technologies: Apache Kafka, Apache Flink, Apache Spark Streaming, Apache Pulsar, cloud-native streaming services (e.g., Amazon Kinesis, Google Cloud Dataflow, Azure Stream Analytics), and real-time analytical databases.
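A minimal streaming sketch with Spark Structured Streaming reading from Kafka, assuming the spark-sql-kafka connector is on the classpath; the broker address, topic name, and event schema are placeholders.

```python
# A minimal streaming pipeline sketch: Spark Structured Streaming over Kafka.
# Broker address, topic, and schema are assumptions for illustration.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json, window
from pyspark.sql.types import StructType, StringType, TimestampType

spark = SparkSession.builder.appName("streaming-sketch").getOrCreate()

schema = (StructType()
          .add("user_id", StringType())
          .add("event_type", StringType())
          .add("event_time", TimestampType()))

events = (spark.readStream.format("kafka")
          .option("kafka.bootstrap.servers", "localhost:9092")  # assumed broker
          .option("subscribe", "clickstream")                   # assumed topic
          .load()
          .select(from_json(col("value").cast("string"), schema).alias("e"))
          .select("e.*"))

# Count events per type in 1-minute windows, tolerating 30 seconds of lateness
counts = (events
          .withWatermark("event_time", "30 seconds")
          .groupBy(window("event_time", "1 minute"), "event_type")
          .count())

query = (counts.writeStream
         .outputMode("update")
         .format("console")     # console sink for the sketch; a real pipeline
         .start())               # would write to a lakehouse table or topic
query.awaitTermination()
```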
4. The Rise of Data Mesh & Data Fabric Architectures
What it is:
* Data Mesh: A decentralized sociotechnical approach that emphasizes domain-oriented data ownership, treating data as a product, self-serve data infrastructure, and federated computational governance.
* Data Fabric: An architectural approach that automates data integration and delivery across disparate data sources, often using metadata and AI to provide a unified view and access to data regardless of where it resides.
Why it's significant: Traditional centralized data architectures struggle with the scale and complexity of modern data. These approaches offer greater agility and scalability, and empower domain teams.
Impact: Improved data accessibility and discoverability, faster time-to-insight for domain teams, reduced bottlenecks for central data teams, and better alignment of data with business domains.
Key Technologies: Data catalogs, data virtualization tools, API-based data access, and platforms supporting decentralized data management.
5. Enhanced Focus on Data Observability & Governance
What it is:
* Data Observability: Going beyond traditional monitoring to provide deep visibility into the health and state of data and data pipelines. It involves tracking data lineage, quality, freshness, schema changes, and distribution.
* Data Governance by Design: Integrating robust data governance, security, and compliance practices directly into the data lifecycle and infrastructure from the outset, rather than as an afterthought.
Why it's significant: As data volumes and complexity grow, ensuring data quality, reliability, and compliance (e.g., GDPR, CCPA) becomes paramount for building trust and making sound decisions. Regulatory landscapes, like the EU AI Act, are also making strong governance non-negotiable.
Impact: Improved data trust and reliability, faster incident resolution, better compliance, and more secure data handling.
Key Technologies: AI-powered data observability platforms, data cataloging tools with governance features, automated data quality frameworks, and tools supporting data lineage.
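A toy sketch of what pipeline-level observability checks can look like in plain Python, covering freshness, volume, and schema drift. The table schema, thresholds, and column names are assumptions, not any vendor's API.

```python
# Minimal observability checks for a batch: freshness, volume, schema drift.
# Expected schema, thresholds, and column names are illustrative assumptions.
from datetime import datetime, timedelta, timezone
import pandas as pd

EXPECTED_COLUMNS = {
    "order_id": "int64",
    "amount": "float64",
    "loaded_at": "datetime64[ns, UTC]",
}
MAX_STALENESS = timedelta(hours=2)
MIN_ROWS = 1_000

def observe(df: pd.DataFrame) -> list[str]:
    """Return a list of human-readable incidents for this batch."""
    incidents = []
    # Freshness: the newest record should be recent
    lag = datetime.now(timezone.utc) - df["loaded_at"].max()
    if lag > MAX_STALENESS:
        incidents.append(f"stale data: newest record is {lag} old")
    # Volume: sudden drops often signal an upstream failure
    if len(df) < MIN_ROWS:
        incidents.append(f"low volume: {len(df)} rows (< {MIN_ROWS})")
    # Schema drift: columns or dtypes changed
    actual = {c: str(t) for c, t in df.dtypes.items()}
    if actual != EXPECTED_COLUMNS:
        incidents.append(f"schema drift: {actual}")
    return incidents

# Usage: alert or fail the pipeline run if observe(latest_batch) is non-empty.
```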
6. Maturation of DataOps and MLOps Practices
What it is:
* DataOps: Applying Agile and DevOps principles (automation, collaboration, continuous integration/continuous delivery - CI/CD) to the entire data analytics lifecycle, from data ingestion to insight delivery.
* MLOps: Extending DevOps principles specifically to the machine learning lifecycle, focusing on streamlining model development, deployment, monitoring, and retraining.
Why it's significant: These practices are crucial for improving the speed, quality, reliability, and efficiency of data and machine learning pipelines.
Impact: Faster delivery of data products and ML models, improved data quality, enhanced collaboration between data engineers, data scientists, and IT operations, and more reliable production systems.
Key Technologies: Workflow orchestration tools (e.g., Apache Airflow, Kestra), CI/CD tools (e.g., Jenkins, GitLab CI), version control systems (Git), containerization (Docker, Kubernetes), and MLOps platforms (e.g., MLflow, Kubeflow, SageMaker, Azure ML).
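As a DataOps illustration, here is a minimal Airflow DAG skeleton (written against Airflow 2.4+) that chains extraction, a data-quality gate, and loading. The DAG id, schedule, and task bodies are placeholders; the point is a versionable, testable pipeline definition.

```python
# A minimal DataOps-style sketch: an Airflow DAG chaining extract -> quality
# gate -> load. DAG id, schedule, and task bodies are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(**_):
    print("pull yesterday's partition from the source system")

def quality_gate(**_):
    print("raise here to fail the run if row counts or schemas look wrong")

def load(**_):
    print("publish the validated partition to the warehouse/lakehouse")

with DAG(
    dag_id="daily_orders_pipeline",   # illustrative name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",                # 'schedule' is the Airflow 2.4+ parameter name
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_check = PythonOperator(task_id="quality_gate", python_callable=quality_gate)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_check >> t_load    # explicit, reviewable dependencies
```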
The Cross-Cutting Theme: Cloud-Native and Cost Optimization
Underpinning many of these trends is the continued dominance of cloud-native data engineering. Cloud platforms (AWS, Azure, GCP) provide the scalable, flexible, and managed services that are essential for modern data infrastructure. Coupled with this is an increasing focus on cloud cost optimization (FinOps for data), as organizations strive to manage and reduce the expenses associated with large-scale data processing and storage in the cloud.
The Evolving Role of the Data Engineer
These trends are reshaping the role of the data engineer. Beyond building pipelines, data engineers in 2025 are increasingly becoming architects of more intelligent, automated, and governed data systems. Skills in AI/ML, cloud platforms, real-time processing, and distributed architectures are becoming even more crucial.
Global Relevance, Local Impact
These global data engineering trends are particularly critical for rapidly developing digital economies. In countries like India, where the data explosion is immense and the drive for digital transformation is strong, adopting these advanced data engineering practices is key to harnessing data for innovation, improving operational efficiency, and building competitive advantages on a global scale.
Conclusion: Building the Future, One Pipeline at a Time
The field of data engineering is more dynamic and critical than ever. The trends of 2025 point towards more automated, real-time, governed, and AI-augmented data infrastructures. For data engineering professionals and the organizations they serve, embracing these changes means not just keeping pace, but actively shaping the future of how data powers our world.
1 note ¡ View note
literaturereviewhelp ¡ 3 months ago
Text
“Compare the impact of sociotechnical systems on the emergence of two technologies - one developed before 1920 and one after - using two topics from the course.”
Introduction
Technology has brought a new face to interaction between people who exhibit different behaviors. In many cases, the use of technology has led to multiple and unexpected outcomes. However, it is worthwhile to recognize that technology is entrenched in a multifaceted set of procedures, people, environments, and other technologies. The interactions of all these facets make up socio-technical systems. Sociotechnical systems are an approach to organizational work design that takes cognizance of both people and technology in their workplaces, in the context of organizational development. The term describes the manner in which the infrastructural complexities of a society interact with human behavior. In this regard, society and many of its substructures become socio-technical systems with many complexities. Of the many possible socio-technical systems, this paper compares Facebook and the telegraph.
Facebook and the telegraph
The telegraph is a communication system that transmits and receives simple, unmodulated electrical impulses, typically over a wire connecting a transmission station to a reception station. Although large-scale telegraphy operations began in the 1840s, the service discussed here was introduced at the beginning of the 20th century: from 1917, short congratulatory messages were sent to people celebrating their one-hundredth birthdays, with the Royal Mail’s Inland Telegraph Service providing the first such service (Metcalfe, 1992, p. 5). Facebook, for its part, was founded in 2004 by Mark Zuckerberg while he was still a student at Harvard University, and has since emerged as the leading social networking site, with close to 900 million users. Both Facebook and the telegraph are information systems through which people and organizations interact and share information.
Comparisons between Facebook and the telegraph
The telegraph requires a human intermediary to operate, while Facebook is operated by the individual user: apart from rights administrators, a user can operate Facebook without any external intervention. Efficient telegraphy requires employees stationed at the transmission and reception stations to manage the process of communication, and a telegraph operator can hold a direct, personal conversation without knowing the identity of the other person. As information systems, both Facebook and the telegraph offer socio-technical platforms where participants develop, organize, and manage information and its contents (Grint & Willocks, 1995, p. 54). This makes both systems fit for organizational management: they satisfy the information needs of organizations through the production, dissemination, and control of knowledge. However, while the telegraph relies on paperwork to convey information, Facebook requires an internet-enabled device to transmit it. Such gadgets might include computers, mobile technologies…
0 notes
jcmarchi ¡ 7 months ago
Text
3 Questions: Community policing in the Global South
New Post has been published on https://thedigitalinsider.com/3-questions-community-policing-in-the-global-south/
3 Questions: Community policing in the Global South
The concept of community policing gained wide acclaim in the U.S. when crime dropped drastically during the 1990s. In Chicago, Boston, and elsewhere, police departments established programs to build more local relationships, to better enhance community security. But how well does community policing work in other places? A new multicountry experiment co-led by MIT political scientist Fotini Christia found, perhaps surprisingly, that the policy had no impact in several countries across the Global South, from Africa to South America and Asia.
The results are detailed in a new edited volume, “Crime, Insecurity, and Community Policing: Experiments on Building Trust,” published this week by Cambridge University Press. The editors are Christia, the Ford International Professor of the Social Sciences in MIT’s Department of Political Science, director of the MIT Institute for Data, Systems, and Society, and director of the MIT Sociotechnical Systems Research Center; Graeme Blair of the University of California at Los Angeles; and Jeremy M. Weinstein of Stanford University. MIT News talked to Christia about the project.
Q: What is community policing, and how and where did you study it?
A: The general idea is that community policing, actually connecting the police and the community they are serving in direct ways, is very effective. Many of us have celebrated community policing, and we typically think of the 1990s Chicago and Boston experiences, where community policing was implemented and seen as wildly successful in reducing crime rates, gang violence, and homicide. This model has been broadly exported across the world, even though we don’t have much evidence that it works in contexts that have different resource capacities and institutional footprints.
Our study aims to understand if the hype around community policing is justified by measuring the effects of such policies globally, through field experiments, in six different settings in the Global South. In the same way that MIT’s J-PAL develops field experiments about an array of development interventions, we created programs, in cooperation with local governments, about policing. We studied if it works and how, across very diverse settings, including Uganda and Liberia in Africa, Colombia and Brazil in Latin America, and the Philippines and Pakistan in Asia.
The study, and book, is the result of collaborations with many police agencies. We also highlight how one can work with the police to understand and refine police practices and think very intentionally about all the ethical considerations around such collaborations. The researchers designed the interventions alongside six teams of academics who conducted the experiments, so the book also reflects an interesting experiment in how to put together a collaboration like this.
Q: What did you find?
A: What was fascinating was that we found that locally designed community policing interventions did not generate greater trust or cooperation between citizens and the police, and did not reduce crime in the six regions of the Global South where we carried out our research.
We looked at an array of different measures to evaluate the impact, such as changes in crime victimization, perceptions of police, as well as crime reporting, among others, and did not see any reductions in crime, whether measured in administrative data or in victimization surveys.
The null effects were not driven by concerns of police noncompliance with the intervention, crime displacement, or any heterogeneity in effects across sites, including individual experiences with the police.
Sometimes there is a bias against publishing so-called null results. But because we could show that it wasn’t due to methodological concerns, and because we were able to explain how such changes in resource-constrained environments would have to be preceded by structural reforms, the finding has been received as particularly compelling.
Q: Why did community policing not have an impact in these countries?
A: We felt that it was important to analyze why it doesn’t work. In the book, we highlight three challenges. One involves capacity issues: This is the developing world, and there are low-resource issues to begin with, in terms of the programs police can implement.
The second challenge is the principal-agent problem, the fact that the incentives of the police may not align in this case. For example, a station commander and supervisors may not appreciate the importance of adopting community policing, and line officers might not comply. Agency problems within the police are complex when it comes to mechanisms of accountability, and this may undermine the effectiveness of community policing.
A third challenge we highlight is the fact that, to the communities they serve, the police might not seem separate from the actual government. So, it may not be clear if police are seen as independent institutions acting in the best interest of the citizens.
We faced a lot of pushback when we were first presenting our results. The potential benefits of community policing is a story that resonates with many of us; it’s a narrative suggesting that connecting the police to a community has a significant and substantively positive effect. But the outcome didn’t come as a surprise to people from the Global South. They felt the lack of resources, and potential problems about autonomy and nonalignment, were real. 
0 notes
operationalinsights ¡ 8 months ago
Text
The Evolution of Organizational Development: A Historical Perspective and Contemporary Update
Organizational Development (OD) has emerged as a dynamic and multifaceted field that has significantly shaped the trajectory of modern organizations. Rooted in the human relations movement of the mid-20th century, OD has evolved to address the complex challenges and opportunities presented by the rapidly changing global business landscape. This essay delves into the historical development of OD, tracing its roots from its origins to its contemporary applications. It explores the key milestones, theoretical underpinnings, and practical interventions that have defined the field. Furthermore, it examines the contemporary updates and emerging trends that are shaping the future of OD.
Historical Development of OD
The Human Relations Movement (1950s-1960s)
The seeds of OD were sown during the Human Relations Movement, a period marked by a shift from a mechanistic view of organizations to a more humanistic perspective. Pioneers such as Elton Mayo, Kurt Lewin, and Douglas McGregor challenged the traditional, hierarchical approach to management and emphasized the importance of human factors in organizational effectiveness. Key concepts that emerged from this era include:
Hawthorne Studies: These groundbreaking studies highlighted the impact of social and psychological factors on worker productivity, demonstrating that employees are not merely motivated by economic incentives.
Group Dynamics: Lewin's work on group dynamics underscored the significance of group processes and interpersonal relationships in shaping organizational behavior.
Theory X and Theory Y: McGregor's contrasting theories offered two different views of human nature, with Theory X assuming that employees are inherently lazy and require close supervision, and Theory Y suggesting that employees are motivated and capable of self-direction.
Organizational Behavior (1970s-1980s)
Building upon the foundations laid by the Human Relations Movement, the field of Organizational Behavior emerged in the 1970s and 1980s. This period witnessed a surge of research and theoretical development, focusing on understanding individual and group behavior within organizational settings. Key contributions during this time include:
Contingency Theory: This theory proposed that there is no one-size-fits-all approach to organizational design and management. Instead, the most effective approach depends on various contextual factors, such as organizational size, industry, and culture.
Systems Theory: This perspective views organizations as complex systems composed of interrelated parts that influence one another. It emphasizes the importance of understanding the whole system rather than focusing on individual components.
Sociotechnical Systems Theory: This theory highlights the interdependence of social and technical systems within organizations. It suggests that optimal organizational performance requires a balance between these two elements.
Strategic Planning (1990s-2000s)
In the 1990s and 2000s, OD expanded its focus to align with strategic planning and organizational performance. This period saw the integration of OD interventions with strategic initiatives, aiming to enhance organizational effectiveness and competitiveness. Key developments during this time include:
Strategic OD: This approach involves using OD interventions to support the implementation of strategic plans and achieve organizational goals.
Mergers and Acquisitions: OD played a crucial role in managing change and integrating diverse organizational cultures during mergers and acquisitions.
Total Quality Management (TQM): OD contributed to the implementation of TQM initiatives, which focused on continuous improvement and customer satisfaction.
The Digital Age (2010s-present)
The advent of digital technologies has profoundly transformed organizations, necessitating a new wave of OD interventions. Contemporary OD addresses the challenges and opportunities presented by the digital age, including:
Digital Transformation: OD supports organizations in navigating the complexities of digital transformation, including adopting new technologies, redefining business models, and fostering a digital culture.
Remote Work and Virtual Teams: OD helps organizations manage remote work arrangements, build virtual teams, and maintain effective communication and collaboration.
Artificial Intelligence and Automation: OD addresses the ethical implications and organizational impact of AI and automation, including workforce reskilling and job redesign.
Cybersecurity: OD plays a role in enhancing cybersecurity awareness and building organizational resilience against cyber threats.
Contemporary Updates
Agile and Adaptive Organizations
In today's rapidly changing business environment, organizations need to be agile and adaptive to thrive. OD supports this by promoting:
Agile Methodologies: Adopting agile principles and practices to foster flexibility, collaboration, and rapid response to market changes.
Continuous Learning: Encouraging a culture of continuous learning and development to keep pace with technological advancements and emerging trends.
Experimentation and Innovation: Creating a safe space for experimentation and innovation, fostering a mindset of risk-taking and creative problem-solving.
Diversity, Equity, and Inclusion
Diversity, equity, and inclusion (DEI) have become critical priorities for organizations. OD contributes to DEI efforts by:
Unconscious Bias Training: Raising awareness of unconscious biases and their impact on organizational decision-making and employee experiences.
Inclusive Leadership Development: Developing leaders who can create inclusive work environments and empower diverse teams.
Employee Resource Groups (ERGs): Supporting ERGs to foster a sense of belonging and provide networking opportunities for employees from diverse backgrounds.
Digital Transformation and Technology Integration
OD plays a vital role in helping organizations leverage technology to drive innovation and improve performance. This includes:
Digital Workplace Design: Creating digital workspaces that enhance employee productivity and collaboration.
Data Analytics and Insights: Using data analytics to inform decision-making and identify opportunities for improvement.
Change Management: Supporting the adoption of new technologies and processes through effective change management strategies.
Sustainability and Social Responsibility
Organizations are increasingly expected to be socially responsible and environmentally sustainable. OD contributes to these efforts by:
Sustainability Initiatives: Supporting the development and implementation of sustainability initiatives, such as reducing carbon footprint and promoting ethical sourcing.
Social Impact Measurement: Developing metrics to measure the social impact of organizational activities.
Corporate Social Responsibility (CSR): Integrating CSR into the core business strategy and aligning it with organizational values.
Data-Driven Decision Making
Data-driven decision-making has become essential for organizations to make informed choices and optimize performance. OD supports this by:
Data Literacy: Enhancing the data literacy of employees to enable them to interpret data and draw meaningful insights.
Data-Driven Culture: Fostering a culture of data-driven decision-making, where data is used to inform strategy, operations, and innovation.
Data Ethics: Ensuring that data is collected, stored, and used ethically and responsibly.
Conclusion
Organizational Development has evolved significantly over the past seven decades, adapting to the changing needs and challenges of organizations. From its early focus on human relations to its contemporary emphasis on digital transformation, sustainability, and DEI, OD continues to be a vital discipline for driving organizational success. As the business landscape continues to evolve, OD will remain a critical tool for organizations to navigate complexity, foster innovation, and create sustainable value. By understanding the historical development of OD and its contemporary updates, organizations can leverage its power to build a brighter future.
0 notes
edtechnews ¡ 8 months ago
Text
Keeping AI real
Gen AI represents a sociotechnical revolution with massive implications for our lives. Keeping humans in the loop is critical for its responsible development.
EDTECH@UTRGV's insight:
"When you have such a powerful technology, which is going to impact business, society, and our planet in ways that we’d not thought possible before, one of the key questions that comes to mind is, 'Do we understand this technology?'”
0 notes