#differences between cost estimating and cost control
asestimationsconsultants · 4 months ago
Text
Cost Estimating Service vs. Cost Control | Key Differences Explained
Cost estimating and cost control are two essential concepts in project management, both aimed at keeping a project within its budget. Although they share that common goal, they are distinct processes with different objectives and techniques, and each contributes uniquely to a project’s financial success. Understanding the key differences between them is therefore crucial for project managers. In this article, we explore the differences between cost estimating services and cost control, their roles in project management, and how they work together to protect a project’s financial health.
What is Cost Estimating?
Cost estimating is the process of predicting the financial resources required to complete a project. This process involves calculating the cost of materials, labor, equipment, and other necessary resources based on the project’s scope and requirements. A cost estimating service is typically engaged during the initial stages of a project to provide a detailed budget estimate that guides the entire project’s financial planning.
Cost estimating involves analyzing a variety of factors to provide an accurate prediction of how much the project will cost. These factors can include historical data from similar projects, current market rates for materials and labor, and the complexity of the project. The goal of cost estimating is to produce a reliable estimate that reflects the true cost of completing the project from start to finish.
Cost estimating services often use specialized software, data analytics, and expert knowledge to produce accurate estimates. The result is a comprehensive cost breakdown that serves as a financial blueprint for the project. This estimate helps businesses determine the feasibility of the project, secure funding, and set expectations for both clients and stakeholders.
What is Cost Control?
Cost control, on the other hand, is the process of managing and monitoring project costs throughout its lifecycle. While cost estimating provides a forecast of the costs, cost control ensures that the actual costs do not exceed the estimate. Cost control involves tracking project expenses, comparing them to the initial budget, and making adjustments as needed to keep the project within its financial parameters.
Cost control is a continuous process that occurs throughout the project’s execution. It involves monitoring costs on a regular basis, identifying any discrepancies between the actual expenses and the estimated costs, and taking corrective actions if necessary. This can include finding ways to reduce costs, reallocating resources, or negotiating with suppliers to get better rates. The aim of cost control is to prevent cost overruns and ensure that the project is completed within the allocated budget.
One key aspect of cost control is the use of performance measurement tools, such as earned value management (EVM), to track project progress and costs. These tools help project managers assess whether the project is on track in terms of both time and budget. If the project is at risk of going over budget, cost control measures can be implemented to mitigate the situation and bring costs back in line with the original estimate.
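The core EVM indicators mentioned above take only a few lines to compute. This is an illustrative sketch, not taken from any particular tool, and the project figures are hypothetical:

```python
# Illustrative earned value management (EVM) metrics; all figures hypothetical.

def evm_metrics(bac, planned_value, earned_value, actual_cost):
    """Standard EVM indicators for a project snapshot.

    bac: budget at completion
    planned_value (PV): budgeted cost of work scheduled to date
    earned_value (EV): budgeted cost of work actually performed
    actual_cost (AC): actual cost of work performed
    """
    cpi = earned_value / actual_cost    # cost performance index (<1 = over cost)
    spi = earned_value / planned_value  # schedule performance index (<1 = behind)
    cv = earned_value - actual_cost     # cost variance
    eac = bac / cpi                     # estimate at completion, assuming current efficiency
    return {"CPI": cpi, "SPI": spi, "CV": cv, "EAC": eac}

# Example: a $100k project with 40% of the work done, $45k spent,
# and $50k of work scheduled by this date.
m = evm_metrics(bac=100_000, planned_value=50_000,
                earned_value=40_000, actual_cost=45_000)
print(m)
```

A CPI below 1.0 signals that work is costing more than budgeted, which is precisely the trigger for the corrective actions described above.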
Key Differences Between Cost Estimating and Cost Control
While both cost estimating and cost control are integral to managing project finances, they differ significantly in their objectives, timing, and methods.
Objective: The primary goal of cost estimating is to predict the costs of a project and provide an accurate budget. Cost estimating focuses on determining how much the project will cost, based on available data, market conditions, and project scope. In contrast, cost control focuses on ensuring that the project stays within the approved budget by monitoring actual expenses and making adjustments as needed.
Timing: Cost estimating occurs during the planning phase of a project, before the project begins. This is when the cost estimate is developed, and it serves as the foundation for the project’s financial planning. Cost control, on the other hand, takes place throughout the project’s execution phase. It begins once the project starts and continues until the project is completed, ensuring that expenses remain within the approved budget.
Methods and Techniques: Cost estimating relies on a variety of techniques to predict costs, including historical data analysis, expert judgment, and industry standards. Cost estimating services may use specialized software to calculate and present detailed estimates that account for materials, labor, and other costs. The process also involves risk analysis to identify potential cost fluctuations and uncertainties that may affect the budget.
Cost control, on the other hand, involves actively tracking and monitoring costs during the project. Techniques used in cost control include regular cost reporting, variance analysis, and performance measurement tools. Cost control professionals use these techniques to identify cost discrepancies and address issues before they lead to significant budget overruns.
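The variance-analysis step can be sketched as follows; the category names and figures here are made up purely for illustration:

```python
# Hypothetical month-end variance report: actual spend vs. budget per category.
budget = {"materials": 120_000, "labor": 80_000, "equipment": 30_000}
actual = {"materials": 131_000, "labor": 76_000, "equipment": 30_500}

# Positive variance = overspend relative to budget.
variances = {c: actual[c] - budget[c] for c in budget}

for category, variance in variances.items():
    pct = 100 * variance / budget[category]
    flag = "OVER BUDGET" if variance > 0 else "within budget"
    print(f"{category:<10} {variance:+9,} ({pct:+.1f}%) {flag}")
```

Flagging categories this way is what lets a project manager act before a small discrepancy becomes a significant overrun.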
Role in Project Management: Cost estimating is crucial for the initial planning and budgeting of a project. Without an accurate estimate, it’s difficult to determine if a project is financially viable, secure funding, or establish realistic expectations for clients and stakeholders. Cost control is essential for ensuring that the project stays within its financial parameters once it’s underway. It helps ensure that resources are used efficiently and that any issues that arise can be addressed promptly to prevent costly delays.
How Cost Estimating and Cost Control Work Together
Although cost estimating and cost control are distinct processes, they are interconnected and work together to ensure that a project is completed on time and within budget. Cost estimating provides the foundation for cost control. The initial estimate serves as the baseline for tracking and controlling costs during the project. By comparing actual costs to the estimate, project managers can identify areas where adjustments are needed and make data-driven decisions to keep the project on track.
For example, if cost control reveals that a particular aspect of the project is exceeding its budget, the project manager can revisit the original cost estimate to determine if the estimate was accurate or if unforeseen factors have contributed to the overrun. This feedback loop allows for continuous improvement in both cost estimation and cost control processes, helping ensure that future projects are even more accurate and well-managed.
Conclusion
Cost estimating and cost control are two essential components of effective project management. While cost estimating focuses on predicting the costs of a project, cost control ensures that the project stays within the budget. These two processes, although distinct, work hand in hand to manage a project’s financial resources. By understanding the differences and how they complement each other, businesses can better plan, execute, and control projects, ultimately leading to greater financial success and project completion within budget.
0 notes
drnikolatesla · 2 months ago
Text
Tesla’s Wardenclyffe Tower: Built on Sound Math, Undone by Cost and Misunderstanding
Let’s set the record straight—Nikola Tesla’s Wardenclyffe Tower was a high-voltage experimental transmission system grounded in quarter-wave resonance and electrostatic conduction—not Hertzian radiation. And the math behind it? It was solid—just often misunderstood by people applying the wrong physics.
In May 1901, Tesla calculated that to set the Earth into electrical resonance, he needed a quarter-wavelength system with a total conductor length of about 22,500 cm, or 738 feet.
So Tesla’s tower design had to evolve during construction. In a letter dated September 13, 1901, to architect Stanford White, Tesla wrote: “We cannot build that tower as outlined.” He scaled the visible height down to 200 feet. The final structure—based on photographic evidence and Tesla’s own testimony—stood at approximately 187 feet above ground. To meet the required electrical length, Tesla engineered a system that combined spiral coil geometry, an elevated terminal, a 120-foot vertical shaft extending underground, and radial pipes buried outward for approximately 300 feet. This subterranean network, together with the 187-foot tower and carefully tuned inductance, formed a continuous resonant conductor that matched Tesla’s target of 738 feet. He described this strategy in his 1897 patent (No. 593,138) and expanded on it in his 1900 and 1914 patents, showing how to simulate a longer conductor using high-frequency, resonant components. Even with a reduced visible height, Tesla’s system achieved quarter-wave resonance by completing the rest underground—proving that the tower’s electrical length, not its physical height, was what really mattered.
Tesla calculated his voltages to be around 10 million statvolts (roughly 3 billion volts in modern SI), so he had to consider corona discharge and dielectric breakdown. That’s why the terminal was designed with large, smooth spherical surfaces—to minimize electric surface density and reduce energy loss. This was no afterthought; it’s a core feature of his 1914 patent and clearly illustrated in his design sketches.
Now, about that ±16 volt swing across the Earth—what was Tesla talking about?
He modeled the Earth as a conductive sphere with a known electrostatic capacity. Using the relation:
ε × P = C × p
Where:
ε is the terminal’s capacitance (estimated at 1,000 cm)
P is the applied voltage (10⁷ statvolts)
C is the Earth’s capacitance, which Tesla estimated at 5.724 × 10⁸ cm (based on the Earth’s size)
p is the resulting voltage swing across the Earth
Plugging in the numbers gives p ≈ 17.5 volts, which Tesla rounded to ±16 volts. That’s a theoretical 32-volt peak-to-peak swing globally—not a trivial claim, but one rooted in his framework.
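The arithmetic is easy to check. A quick sketch in Python, using only the cgs-unit values quoted above (the variable names are mine):

```python
# Tesla's relation: epsilon * P = C * p  ->  p = epsilon * P / C
# All values in the cgs electrostatic system, as quoted in the text.
terminal_capacitance = 1.0e3   # epsilon: terminal capacitance, in cm
applied_potential = 1.0e7      # P: applied potential, in statvolts
earth_capacitance = 5.724e8    # C: Tesla's 1901 estimate of Earth's capacitance, cm

# Resulting swing in the Earth's potential
earth_swing = terminal_capacitance * applied_potential / earth_capacitance
print(f"p = {earth_swing:.1f}")  # ~17.5, the figure Tesla rounded to +/-16 V
```

Swapping in the revised Earth capacitance mentioned below the quote changes only the input, not the relation itself.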
Modern recalculations, based on updated geophysical models, suggest a smaller swing—closer to ±7 volts—using a revised Earth capacitance of about 7.1 × 10⁸ cm. But that’s not a knock on Tesla’s math. His original ±16V estimate was fully consistent with the cgs system and the best data available in 1901, where the Earth was treated as a uniformly conductive sphere.
The difference between 7 and 16 volts isn’t about wrong numbers; it’s about evolving assumptions. Tesla wrote the equation, and others merely adjusted the inputs. His premise—that the Earth could be set into controlled electrical resonance—still stands. The voltage swing may change, but the vision didn’t.
Wouldn't that ±16V swing affect nature or people? Not directly. It wasn’t a shock or discharge—it was a global oscillation in Earth’s electric potential, spread evenly across vast distances. The voltage gradient would be tiny at any given point—far less than what’s generated by everyday static electricity. Unless something was specifically tuned to resonate with Tesla’s system, the swing had no noticeable effect on people, animals, or the environment. It was a theoretical signature of resonance, not a hazard. While some early experiments in Colorado Springs did produce disruptive effects—like sparks from metal objects or spooked horses—those involved untuned, high-voltage discharges during Tesla’s exploratory phase. Wardenclyffe, by contrast, was a refined and carefully grounded system, engineered specifically to minimize leakage, discharge, and unintended effects.
And Tesla wasn’t trying to blast raw power through the ground. He described the system as one that would “ring the Earth like a bell,” using sharp, high-voltage impulses at a resonant frequency to create standing waves. As he put it:
“The secondary circuit increases the amplitude only... the actual power is only that supplied by the primary.” —Tesla, Oct. 15, 1901
Receivers, tuned to the same frequency, could tap into the Earth’s oscillating potential—not by intercepting radiated energy, but by coupling to the Earth’s own motion. That ±16V swing wasn’t a bug—it was the signature of resonance. Tesla’s transmitter generated it by pumping high-frequency, high-voltage impulses into the Earth, causing the surface potential to oscillate globally. That swing wasn’t the energy itself—it acted like a resonant “carrier.” Once the Earth was ringing at the right frequency, Tesla could send sharp impulses through it almost instantly, and tuned receivers could extract energy.
So—was it feasible?
According to Tesla’s own patents and 1916 legal testimony, yes. He accounted for insulation, voltage gradients, tuning, and corona losses. His design didn’t rely on brute force, but on resonant rise and impulse excitation. Tesla even addressed concerns over losses in the Earth—his system treated the planet not as a passive resistor but as an active component of the circuit, capable of sustaining standing waves.
Wardenclyffe wasn’t a failure of science. It was a casualty of cost, politics, and misunderstanding. Tesla’s system wasn’t just about wireless power—it was about turning the entire planet into a resonant electrical system. His use of electrostatics, high-frequency resonance, and spherical terminals was decades ahead of its time—and still worth studying today.
“The present is theirs; the future, for which I really worked, is mine.” —Nikola Tesla
81 notes · View notes
Text
Summary that was posted on social media:
1) Here it is, our overview of the most interesting ME/CFS studies of 2024. If you think we’re missing an important one, feel free to post it as a comment below
https://mecfsskeptic.com/2024-looking-back-on-a-year-of-me-cfs-research
2) A recent preprint used data from the UK Biobank and showed that there are many differences in the blood between ME/CFS patients and controls. Their analysis suggests that these differences are not due to inactivity or deconditioning.
3) The Intramural NIH study did the most extensive set of biological measurements ever conducted in ME/CFS patients but because of the low sample size (n = 17) and focus on ‘effort preference’ it has mostly led to disappointment.
4) The MCAM study recruited patients from 7 ME/CFS specialty clinics and assessed cognitive functioning in more than 200 patients. Accuracy was relatively normal but information processing speed was significantly lower in patients versus controls.
5) This year, the largest study on repeated cardiopulmonary exercise testing in ME/CFS was published. Although there were small to moderate effects, there was a large overlap between patients and controls. It is unclear if this procedure can be used as a diagnostic test.
6) 2024 also saw two big rehabilitation trials for children with ME/CFS: MAGENTA and FITNET-NHS. Both had null results suggesting that GET and online CBT (FITNET) are unlikely to be cost-effective.
7) We also had a new prevalence estimate using statistics data for NHS Hospitals in England. Extrapolating the highest rates to the entire UK would mean that 390,000 people (0.585%) get ME/CFS in their lifetime.
8) A Norwegian study showed how wages dropped dramatically and sickness benefits increased in the years before and after an ME/CFS diagnosis.
9) Honourable mention: following the tragic death of Maeve Boothby O’Neill, Prof. Emeritus Jonathan Edwards wrote a paper on managing nutritional failure in people with severe ME/CFS, including suggestions that can supplement the NICE Guideline.
====== comment: Another excellent blog from this account. Note that the social media summary only highlights a few of the many studies that are discussed in the post. Saying that, I didn’t find it overly long either. Highly recommended.
45 notes · View notes
covid-safer-hotties · 9 months ago
Text
Also preserved on our archive
By - Nicola Davis
Immunologists push for increase in testing and more widespread vaccine booster rollout as new variant, XEC, emerges
Covid is on the rise in England, and experts have warned that more must be done to prevent and control infections after a “capitulation to the virus”.
Prof Danny Altmann, an immunologist at Imperial College London, said those working in the field were perplexed by the current attitude to the battle against Covid, as the latest figures showed an increase in hospital admissions.
The latest data for England from the UK Health Security Agency (UKHSA) showed that hospital admissions increased to 3.71 per 100,000 population for the week between 16 and 22 September 2024, compared with 2.56 per 100,000 the previous week.
The percentage of people with symptoms who have tested positive for Covid, based on tests at sentinel “spotter” laboratories, has also risen in the last week to 11.8% compared with 9.1% in the previous week.
Altmann described the prevailing stance on the virus as a “capitulation”. “To those who work in this field, the current attitude of acceptance to losing this war of attrition against Covid is puzzling and a little desperate,” he said.
“The data, both in the UK and US, show that the current Omicron subvariants are hugely successful at punching through any dwindling population immunity, so that we tolerate huge prevalence of around 12%. Our capitulation to the virus is a combination of a population where most are now many months or years from their last vaccine dose, and that vaccine dose was in any case poorly cross-protective for the very distinct current variants.
“Clearly, there is behavioural polarisation between those who are worried by this and look for mitigation, and those who think we must learn to live with it and that we paid too high a price for our earlier measures,” he said.
Dr Simon Williams, from Swansea University, added that surveys suggest there is also a large group of people who are not thinking much about Covid at all. “Part of this is psychological – for two to three years it was something people had to think about all the time and is something that for many had many negative memories and feelings attached to it,” he said.
While Altmann said debate around measures needed to be properly informed and data-driven and to avoid extreme stances, it was important not to trivialise the impact of the virus.
“Those at the weaker end of the immune response spectrum may often experience four or more breakthrough infections per year. These may range from mild illness to infections requiring several days off work, with all the associated economic costs, plus any additional NHS burden,” he said.
Altmann also stressed the impact of long Covid, noting that it is thought to affect around 400 million people globally – with an estimated 3% of the workforce lost and a global cost of around $1tn annually – and can arise even in vaccinated people following reinfection.
The latest Covid data comes as a new variant is expected to become prevalent in the coming months. Known as XEC, it was first identified in Germany over the summer, and cases have already been identified in the UK. It is thought to have emerged from two other Covid variants, themselves descended from the BA.2.86 variant.
However, experts have said that, at present, XEC is not thought to cause different symptoms from previous variants and does not appear to be fuelling a surge in cases. It is also expected that Covid vaccinations and past infections will continue to offer protection against severe disease.
While bookings for the NHS autumn Covid booster jabs opened this week, Altmann said they should be offered more widely, together with increased use of lateral flow testing to avoid the spread of Covid.
Williams added that it was strange that more had not been done to clean indoor air and improve ventilation in public spaces including schools.
But while he backed offering boosters more widely, he also raised concerns: “I worry that again this autumn we will see a relatively low uptake of the booster among priority groups, including younger adults with a compromised immune system.”
14 notes · View notes
beardedmrbean · 7 months ago
Text
As he vowed to do on X, Rep. Kevin Kiley, R-Roseville, on Wednesday announced he is introducing legislation to eliminate federal funding for the California High-Speed Rail Authority, which seeks to build a high-speed rail line from Los Angeles to San Francisco.
The line would start with the much more modest route of Merced to Bakersfield.
The project has already spent around $6.8 billion in federal dollars and is seeking an additional $8 billion from the U.S. government. Kiley’s bill would make the California high-speed rail project ineligible for any further federal funding.
Kiley is working in tandem with billionaires Elon Musk and Vivek Ramaswamy, both of whom were tapped by President-elect Donald Trump to co-chair a Department of Government Efficiency.
The DOGE, as they are calling it, isn’t really a department, and any recommendations they make would have to be approved by both Trump and the GOP-controlled Congress, but it does have a goal to eliminate vast swaths of federal spending and bureaucracy.
In a statement Wednesday, Kiley called the California high-speed rail project a failure due to “political ineptitude,” and added that “there is no plausible scenario where the cost to federal or state taxpayers can be justified.”
“Our share of federal transportation funding should go towards real infrastructure needs, such as improving roads that rank among the worst in the country,” he said.
Kiley sits on the House Transportation and Infrastructure Committee, which is controlled by Republicans who long have been critical of the project. Should Kiley’s bill pass the House, it would also need to be approved by the Senate, where Republicans hold a 53-seat majority, and be signed by Trump.
That’s likely to be difficult, though, because it would probably take 60 votes to limit debate and both of California’s U.S. senators are Democrats.
The Bee has reached out to the California High-Speed Rail Authority for comment.
In a previous statement to The Bee, authority spokesperson Toni Tinoco said that the agency “continues to make significant progress.”
Tinoco pointed out that the project has been environmentally cleared for all but 31 miles between L.A. and San Francisco, making the project “shovel-ready for future phases of investment.”
Tinoco also pushed back against statements by Kiley and the DOGE about how the project has been beset by both delays (it was initially estimated to be complete by 2020) and massive cost overruns (ballooning from an initial cost estimate of $45 billion to as much as $127.9 billion).
“It did not account for inflation or any unknown scope, a lesson learned as our estimates now account for inflation and project scope, helping explain cost difference,” Tinoco wrote.
7 notes · View notes
Text
New Garage Door Pricing in Plano, Texas
Plano Garage Door Pricing Breakdown: Understanding Your Investment
Roll-up Commercial Garage Door Installation in Plano, TX
Choosing the right garage door goes beyond just its appearance—it's a valuable investment in your home's security, energy efficiency, and curb appeal. For residents of Plano, TX, knowing the key pricing factors is essential in making an informed and budget-conscious decision. This guide provides an in-depth look at what affects the cost of your garage door installation, helping you plan your project with confidence.
Understanding the Full Investment
When planning your garage door budget, it’s important to factor in all costs, not just the door itself. The price can range from a few hundred dollars for basic models to thousands for custom options. Key elements such as materials, size, insulation, and additional features all influence the final price.
Factors that Impact Garage Door Costs
Material Selection
Different materials come with varying price points and benefits, according to Home Guide:
Steel: Durable and low-maintenance, steel doors cost between $800 and $2,500.
Aluminum: Lightweight and rust-resistant, aluminum doors range from $700 to $2,000.
Wood: Natural and traditional, but high-maintenance, with prices from $1,200 to $4,000.
Fiberglass: Lightweight, weather-resistant, and wood-like in appearance, priced from $800 to $1,800.
Vinyl: Resilient and low-maintenance, costing $1,000 to $2,500.
Size and Configuration
Door dimensions directly affect both materials and labor:
Single Doors: Typically 8-10 feet wide, ranging from $500 to $2,500.
Double Doors: Wider, usually 12–16 feet, with prices from $800 to $4,500.
Custom Sizes: Custom designs add 15–30% to the overall cost.
Insulation Options
Insulated doors boost energy efficiency, especially in extreme climates:
Non-Insulated: Ideal for mild climates, starting around $500.
Single-Layer Insulation: Moderate temperature control, priced between $800 and $2,000.
Double-Layer/Foam-Injected Insulation: Premium energy efficiency, costing $1,500 to $4,000.
Design and Style
Your design choice influences both aesthetics and cost:
Traditional Raised Panel: Classic look, priced between $600 and $1,800.
Carriage House Style: Rustic appeal, typically between $1,200 and $3,500.
Modern Glass & Aluminum: Sleek, contemporary designs range from $2,000 to $5,000.
Custom Designs: Unique and fully personalized, ranging from $3,000 to over $10,000.
Added Features and Accessories
Enhancements can elevate both functionality and security:
Security Features: Smart locks ($200–$500) and reinforced frames ($100–$500) enhance security.
Aesthetic Upgrades: Windows ($100–$500) and decorative hardware ($50–$300) add style.
Smart Technology: Wi-Fi-enabled openers ($150–$300) and keypads ($50–$150) offer convenience.
Installation and Maintenance Costs
Professional Installation
Labor costs in Plano range from $200 to $600, depending on door complexity. Additional work like door removal or electrical upgrades can increase costs by $50–$200 and $50–$100 per hour, respectively.
Maintenance & Repair
Regular maintenance helps extend your door’s lifespan:
Routine Maintenance: Professional check-ups cost around $75–$150 annually.
Maintenance Plans: Yearly plans ($100 to $200) include regular upkeep and repair discounts.
Repair Costs:
Minor Repairs: Fixing rollers, hinges, or weather stripping costs $50 to $150.
Major Repairs: Replacing springs or adjusting tracks can range from $200 to $400.
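To see how the ranges above combine, here is a rough back-of-the-envelope total for one hypothetical configuration (a steel door with premium insulation, a smart opener, and professional installation). Treating the quoted ranges as simply additive is a simplification; a real quote prices the door as a whole.

```python
# Illustrative only: sum the low and high ends of the ranges quoted above
# for one hypothetical configuration. Not a substitute for a professional quote.
components = {
    "steel door": (800, 2_500),
    "double-layer insulation": (1_500, 4_000),
    "wi-fi opener": (150, 300),
    "installation labor": (200, 600),
}

low = sum(lo for lo, _ in components.values())
high = sum(hi for _, hi in components.values())
print(f"Estimated total: ${low:,} to ${high:,}")  # $2,650 to $7,400
```

Swapping in different components from the lists above gives a quick feel for how material and feature choices move the total.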
Get a Custom Quote
To get an accurate estimate, contact a trusted garage door professional in Plano who can assess your needs and offer a tailored quote. By balancing upfront costs with long-term savings, you’ll make a smart investment in your home’s security and style. If you're interested in having garage doors installed in Plano, Texas, or nearby areas, contact us today. We understand that you want to read reviews to know how reliable and trustworthy our services are. Read reviews, learn more about us, and contact us through our MapQuest and BBB accounts. Follow us on our social media accounts and listings:
Our HotFrog Account (to read reviews about us and our services)
Our Manta Account (to read reviews about us)
Our MySpace Account (our portfolio)
Address to our company (Apple)
Address to our company (Bing)
3 notes · View notes
niqhtlord01 · 2 years ago
Text
Humans are weird: No sense in Dying
( Don’t forget to come see my on my new patreon and support me for early access to stories and personal story requests :D https://www.patreon.com/NiqhtLord )    
Military Report 3759612.3 Subject: Harvest Conflict Category: Mendigold Incident
Star date: 3751.2
Coalition forces have driven off the remaining Reni Fleet contingents and have full control over the Mendigold system.
Of the system’s four worlds, only Mendigold Prime had a livable environment; the remaining three held minimal mining and colonization facilities.
Mendigold Prime was designated a military garrison world of the Reni empire. As such, the installations on the planet were largely dedicated to the Reni military industrial complex. City-sized barracks, underground ammunition warehouses, dozens of football-field-sized landing fields, and uncountable training grounds for various branches of the Reni military were the primary structures on the planet. Only a small percentage of the Reni population was designated civilian, and they were relegated to either industrial support roles or service industries that entertained off-duty soldiers.
On average the planet’s population was between 500 million and 1 billion, with 93% of that being military personnel. This number was subject to fluctuation as military units rotated in and out of different warzones. When the Coalition arrived in system, the population was estimated at 3 billion, as the Reni military had been preparing for a renewed offensive into Coalition territory and had gathered the majority of their ground forces on the planet.

Star date 3751.3
The Coalition naval fleets have surrounded Mendigold Prime and a blockade has been enforced.
Orbital defenses around the main military bases keep the naval assets from conducting recon scans from lower orbit, but their firing range is limited allowing for the rest of the planet to be mapped without issue.
Scans show that the planet is heavily fortified with existing prefabricated structures while additional defenses had been constructed. These consisted of extensive trench works, bunkers, and newly built gun positions that surrounded each installation for at least three kilometers in every direction.
Further reinforcing the defense of the planet was a large and well equipped air force. The scans showed the existence of several underground bunker complexes that housed the aircraft and protected them from all but a sustained orbital strike. Due to the lack of resistance in the space around the planet it was believed that they were limited to atmospheric aircraft, but it was not discounted that they possessed some fighters that could breach the atmosphere and attack the Coalition navy.
Routine scans were continued while Coalition leadership met to debate plans for the invasion of the planet.
Star Date 3751.6 Coalition leaders were unable to decide on how the ground invasion of Mendigold would progress.
All calculations predicted that any planetary landing would cost Coalition forces 15% casualties. Assaulting any of the heavily defended installations was estimated at 35% casualties per installation, and a total conquest of the planet at an 87% casualty rate.
On previous worlds the Reni had attacked, they had only a few months to entrench themselves before the Coalition could respond. Mendigold, however, was the first Reni military world the Coalition had come into contact with, and the Coalition now faced the might of a well-disciplined and prepared enemy.
Such casualty figures were enough to give even the most bloodthirsty Coalition leader pause. Losses of that kind would cripple the rest of the campaign and leave any future planetary assaults all but impossible.
The debate about what should be done went on for two days while the coalition navy maintained its blockade of the planet. By the third day the leader of the human contingent spoke up with a rather brutal method.
Rather than invading, the human leader, a General Herald Farn, suggested that they simply maintain the blockade and wait for the planet to starve and surrender.
This tactic was all but unheard of and many called it cowardly. It was all the more surprising coming from a human, whose people are renowned as warriors.
General Farn countered the dissenting arguments by stating that it was likewise madness to send their ground forces into a meat grinder and waste them so needlessly.
Taking control of the holographic display, General Farn argued that by surgically striking the storehouses holding the rations and food supplies, the Coalition could ensure the Reni would be unable to maintain rationing for long. He then provided data showing that while the planet was a military world with extensive supply facilities, it was never intended to host such a large garrisoning force for extended periods.
General Farn promised that within a week they would begin seeing results.
Star date 3751.13
Initial bombardment operations were successful.
Over 67% of the targeted supply depots were destroyed by orbital bombardment, which also knocked out every satellite communication platform orbiting the planet. A further 12% were damaged but not entirely destroyed, while the remaining 21% received minor to no damage. Anti-orbital defenses around that remaining 21% were too strong for naval forces to breach long enough to carry out precise strikes, and after losing three cruisers and ten frigates the Coalition navy withdrew and considered the operation done.
In the days that followed, scans showed renewed activity as ground forces further strengthened their positions. New trench lines were dug and the remaining supplies were dispersed to prevent another critical loss.
Reni forces remained on active alert for four days straight, but with no follow-up strikes they were largely left to their own devices. With no communication between the defense pockets, larger coordinated defense efforts ceased.
Several small parties were seen leaving the larger defended areas and going out into the few wild areas that remained on the planet. Analysts surmised that commanders were now foraging for provisions to supplement their dwindling stocks.
General Farn took this as confirmation that the plan was working and within the coming weeks the Reni ground forces would surrender. The coalition leaders agreed and allowed the plan to continue.
Star date 3751.20
Larger foraging parties have now been seen departing the fortified enclaves and spreading out further in search of supplies. While unconfirmed, several of these detachments appear to have engaged in small skirmishes with foraging parties from other enclaves.
Two weeks have passed since the supply depot bombardments and the rapidly degrading state of the Reni military can be seen from orbit within the cities. Small fires have broken out in the more fortified enclaves while smaller redoubts have entirely emptied of all personnel. It is unknown if this was by order or general desertion due to lack of supplies but the number of abandoned bases is increasing.
Star date 3751.27
First displays of open aggression between fortified enclaves have broken out as the supply situation has reached a critical point.
The spaceport under the command of Reni General Hopi was attacked. Spy drones were dispatched to the surface and returned video feeds depicting the attackers to have come from the Central Command Block under the command of Reni General Filar.
Military intelligence gathered prior to the Mendigold campaign had shown a deep hatred between the two generals that went far beyond minor squabbling. With supplies cut off, that hatred may have boiled over into outright violence, as Filar may have believed Hopi still had supplies left at the spaceport.
For the first time in recorded history Reni fought Reni, and the Coalition watched as massed Reni infantry clashed with waves of the spaceport's defense air force.
While this engagement was the largest battle in the brewing Reni civil war on the planet it was not the only conflict. All over the globe Reni forces were fighting each other in a desperate need for supplies.
Star date 3751.30
The situation on the surface is now entirely untenable.
While Coalition leaders have agreed the starvation tactic was successful at weakening the Reni forces, it has produced the unhealthy side effect of triggering a massive Reni civil war planetside.
Central command no longer exists for the Reni; each enclave commander has become a pseudo-warlord in his own right. There is no single leader to open a dialogue with to demand terms of surrender, and even if one accepted, he could only stand down the forces loyal to him while the other warlords fought on.
Coalition leaders have no idea how to defuse the situation, with only General Farn calling the operation a success.
Reni forces have begun dwindling rapidly as each enclave seeks to hoard whatever few supplies remain. Reni casualties are now estimated at around 500 million, with more expected in the coming days as starvation finally takes its full toll on the Reni population.
With no clear plan of action and the planet having lost all tactical value, the Coalition fleet has decided to continue the campaign and leave Mendigold to its fate. A tribunal has been called for the human General Farn to answer for potential crimes resulting from his order. Until the war is over, however, it is unlikely the human will lose his position or rank.
mariacallous · 1 year ago
Even if Israel and Hamas agree to a cease-fire and it holds, normalcy will not return to Gaza anytime soon. For the Palestinians living there, the biggest long-term danger they face may not be Hamas or Israel—it could be a lack of government altogether. A postwar Gaza may join the ranks of Libya, Somalia, Yemen, and other states that suffer near-constant low-level strife, endemic crime, and humanitarian crisis after humanitarian crisis. Such states tend to produce waves of desperate refugees and can fuel further violence.
In Gaza’s modern history, different regimes have ruled the strip, rarely doing so for the benefit of its residents. After the British colonial presence ended in 1948 and a war commenced over Israel’s independence, Egyptian troops advanced into Gaza as part of their attack on Israel, and they kept control of the region in the 1949 armistice agreement between Egypt and Israel. In the years after, Egypt sought to both suppress and exploit Palestinian activism and political Islam. Palestinian cross-border raids were an instrument against Israel, but they could create an escalatory spiral. In the 1950s, repeated cross-border raids contributed to Israel’s decision to go to war against Egypt in 1956.
When Israel took power after conquering Gaza in the 1967 Arab-Israeli War, it also feared Palestinian activism, though it was more permissive than Egypt in allowing political Islam to develop. Under Israeli rule, Gaza’s economy improved, but the enclave remained poorly governed, with Israel less concerned about the well-being of Palestinians and more worried about their support for Palestinian nationalism.
The Palestinian National Authority, a forerunner of the Palestinian Authority (PA), took over the governance of Gaza and portions of the West Bank as part of the Oslo Accords, assuming control in 1994 under Yasser Arafat’s leadership. Although finally under Palestinian rule, the Palestinian leaders were primarily from the diaspora, not Gaza, and the PA focused more on the West Bank. Again, Gaza remained neglected.
Israel reoccupied Gaza during the Second Intifada, which began in 2000, and it withdrew in 2005. Although Israel’s campaign against Hamas weakened its terrorist capabilities, Hamas won a parliamentary election in 2006 and then seized power in Gaza in 2007. Finally, a Gaza-based organization was running Gaza. In some ways, life for Palestinians in the enclave improved—despite Hamas’s repressive ideology. It cracked down on crime, crushed local warlords, provided health and educational services, and was less corrupt than the PA.
At the same time, Israel and much of the international community rejected Hamas’s legitimacy. The group continued waging sporadic attacks on Israel, and Israeli governments placed severe limits on Gaza’s economic development and regularly engaged in destructive military campaigns in the enclave. Israel tried to balance this with limited economic concessions to Gaza, such as issuing more worker permits and offering greater fishing rights, and allowing millions of dollars in aid from Qatar to go to Gaza if Hamas stopped military attacks—a policy that Israel thought was working until the attacks on Oct. 7, 2023, dispelled this illusion.
Whatever limited gains Palestinians in Gaza may have made under Hamas rule have been shattered by Israel’s military response to the attacks committed by Hamas on Oct. 7. The Israeli campaign has killed more than 30,000 Palestinians and displaced 1.9 million people—85 percent of the enclave’s population. More than half of Gaza’s buildings had been damaged or fully destroyed by late January. The United Nations estimates that Gaza will need decades to recover at the cost of tens of billions of dollars—money that may never be provided. And U.N. officials warn that famine and disease will soon sweep the strip.
Some aid does get in, but much of it does not go to the neediest. Hamas has blended in with the population, and much of the day-to-day governance of Gaza, including policing, is gone as Israel continues to target what it considers to be Hamas’s infrastructure in Gaza. Criminal gangs now regularly rob unprotected aid convoys, selling what they steal to desperate Gazans.
Although a cease-fire would reduce some of the suffering, it does not resolve the most important political question: Who will rule Gaza? Israel understandably does not want the Hamas regime that murdered around 1,200 of its citizens to return to power and vows to destroy Hamas. Yet all the other contenders for power are weak, including the PA.
Any non-Hamas government has to worry about two sets of armed actors. Israel is likely to continue at least limited operations against the remnants of Hamas, assassinating its leaders and otherwise trying to prevent the group from reconsolidating. Hamas, for its part, might attack any interim government in order to ensure its political preeminence, and this is especially true if that government cooperates with Israel on security.
As for Gaza’s economy, even massive aid would not restore conditions that existed before Oct. 7—tenuous as they were. Although initial reports claiming that Palestinians from Gaza who worked in Israel had aided the Oct. 7 terrorists appear to be wrong, suspicion toward Palestinians among Israelis will remain high. Israel has accused the U.N. Relief and Works Agency (UNRWA)—which for decades has provided education, health care, food, and other services in Gaza—of being penetrated by Hamas. More broadly, the Oct. 7 attacks discredited Israel’s carrot-and-stick approach of offering limited economic concessions backed up by the threat of force to encourage moderation from Hamas. The future will see few carrots and far more sticks.
Foreign aid, while necessary for daily survival in Gaza, also comes with its own long-term risks for any government in the enclave. Aid from outsiders could have a corrupting effect, making any government less accountable to its own people; ordinary Palestinians in Gaza would have no way to stop officials from siphoning off aid and abusing their power.
This mix of chaos, privation, and conflict poses long-term risks not just for Palestinians, but also for Israel and the rest of the regional states. Many of Gaza's people will try to leave if they can, with next-door neighbor Egypt being the most likely destination, should Cairo allow it. These conditions are also natural feeders for violent groups, which can easily recruit young men who need a paycheck and are bitter toward Israel and the international community for their role in Gaza's desperate condition.
The Israel-Hamas war’s end would only mark the end of a chapter in the book of Palestinian suffering: The next chapter may be about the chaotic postwar period. Too often, U.S. and international policy in the region is focused on establishing a cease-fire or beginning negotiations, and not enough on lessening the suffering of ordinary people.
To reduce the risk of long-term state failure, the United States, European Union members, and others hoping for a solution to the conflict should focus on who will rule Gaza and how that entity’s rule will be enforced in the long term.
screenmobile · 4 months ago
What is the Best Way to Enclose a Patio?
A patio is one of those spaces that can be whatever you want it to be—a peaceful morning coffee spot, a family gathering place, or even a cozy workspace with fresh air. But let's be honest: weather, bugs, and unpredictable seasons can make it tough to use year-round. That’s where enclosing your patio comes in.
Now, you might be wondering: What’s the best way to enclose a patio? Well, that depends on your goals, budget, and how much time you're willing to invest. Whether you’re looking for a simple screened-in space or a full-blown sunroom, there are plenty of ways to turn your patio into a functional, comfortable retreat. Let’s break it down.
Why Enclose Your Patio in the First Place?
Before getting into the details, let’s talk about why enclosing a patio is a great idea in the first place.
Weather Protection – If you live in South Bend, you know the Midwest doesn’t mess around when it comes to weather. Rain, snow, and intense summer heat can make your patio unusable for months at a time. An enclosure helps you enjoy it year-round.
More Living Space – A patio enclosure is like adding an extra room to your home without the headache of a full renovation.
Pest Control – Mosquitoes, flies, and other insects can turn a relaxing evening outside into an annoying battle. A screen enclosure keeps the bugs out without blocking the fresh air.
Increased Home Value – A well-enclosed patio can add curb appeal and boost your home’s resale value, making it a smart investment.
Energy Efficiency – Some patio enclosures can help insulate your home, reducing heating and cooling costs.
Now that we know why it’s worth considering, let’s get into the different ways to enclose your patio.
Screen Enclosures: Keep the Fresh Air, Lose the Bugs
If you love the feeling of an open-air patio but hate dealing with insects, a screened-in patio is the way to go. This is a popular choice because it’s budget-friendly, easy to install, and still lets you enjoy the breeze.
Best For: Homeowners who want a cost-effective enclosure while keeping an outdoor feel.
Cost Estimate: $5,000 – $10,000 (varies based on materials and size)
Pros:
Keeps bugs out while allowing airflow
Affordable and quick to install
Doesn't block your view of the yard
Cons:
Provides no insulation—still cold in the winter
Screens can tear and need occasional replacement
If you want something a little more substantial but still want fresh air, let’s move on to glass enclosures.
Glass Enclosures: More Protection, Better Views
A glass-enclosed patio (sometimes called a sunroom or four-season room) gives you the best of both worlds—natural light and outdoor views, without dealing with the elements. Depending on the glass type, you can even insulate it enough to use year-round.
Best For: Homeowners looking for a high-end, all-season patio space.
Cost Estimate: $15,000 – $40,000 (depending on insulation and materials)
Pros:
Provides full weather protection
Adds value to your home
Can be insulated for winter use
Cons:
More expensive than screens
Requires professional installation
Can get hot in summer without proper ventilation
A glass enclosure is a great way to extend your living space. But if you want a more flexible option, let’s talk about retractable enclosures.
Retractable Patio Enclosures: The Best of Both Worlds
A retractable enclosure is a flexible solution that lets you switch between an open and enclosed patio whenever you want. These can be motorized screens, sliding glass walls, or even retractable roofs—depending on how fancy you want to get.
Best For: Homeowners who want control over when their patio is enclosed.
Cost Estimate: $10,000 – $30,000
Pros:
Adjusts based on the weather
Lets in fresh air when wanted
Adds modern style to your home
Cons:
Can be pricey
Requires regular maintenance
A retractable setup is perfect for homeowners who want to enjoy both open and enclosed patio options. But what if you’re looking for something even more versatile?
Vinyl or Acrylic Enclosures: A Budget-Friendly Alternative
If you like the idea of glass but want to save some cash, vinyl or acrylic panels might be the answer. These enclosures look like glass but cost significantly less.
Best For: Homeowners who want a weatherproof patio without breaking the bank.
Cost Estimate: $8,000 – $20,000
Pros:
Less expensive than glass
Blocks wind and rain
Some panels are removable for fresh air
Cons:
Not as durable as glass
Can yellow or scratch over time
This is a solid choice if you want an enclosed patio without the full commitment of glass walls.
DIY vs. Professional Installation: Should You Do It Yourself?
Thinking about enclosing your patio yourself? It’s possible—but it depends on the complexity of the project.
DIY is great for:
Simple screen enclosures
Temporary vinyl enclosures
Adding curtains or roll-down shades
Hire a pro for:
Glass enclosures
Permanent structures
Motorized or retractable setups
If you're in South Bend, working with a professional like Screenmobile South Bend ensures your enclosure is properly installed, durable, and meets local building codes.
Final Thoughts
So, what’s the best way to enclose your patio? It depends on how you plan to use it:
Want an affordable way to keep bugs out? Go with a screen enclosure.
Looking for a year-round living space? A glass enclosure is your best bet.
Want flexibility? A retractable enclosure lets you switch between open and closed.
Need a budget-friendly alternative to glass? Consider vinyl or acrylic panels.
No matter which option you choose, an enclosed patio adds value, comfort, and functionality to your home. If you’re ready to transform your patio, Screenmobile South Bend can help you find the perfect enclosure for your space.
Thinking about enclosing your patio? Contact Screenmobile South Bend today and turn your outdoor space into the perfect retreat!
bimpro123 · 10 days ago
Industry Foundation Classes and Open BIM: What You Need to Know
In the modern AEC world, digital collaboration is essential. The process of designing, building, and maintaining structures involves numerous teams, software tools, and specialized workflows. IFC (Industry Foundation Classes) and OpenBIM are two core elements that enable these teams to work together, regardless of the software tools they use, without sacrificing the accuracy and integrity of the data.
These concepts are not just technical jargon—they are central to how the construction industry is evolving. Let’s dive deeper into what these terms mean, how they function, and why they are essential to the future of construction projects.
What is IFC (Industry Foundation Classes)?
IFC (Industry Foundation Classes) is a neutral and open file format developed by buildingSMART International for the exchange of data in a digital building model. It is used primarily within the BIM process to ensure that data can be shared across various software applications.
Why IFC Matters?
In simple terms, IFC acts as a bridge between different software applications. Architects, engineers, contractors, and others involved in a building project often use different software tools, but if those tools can’t talk to each other, collaboration becomes difficult. For example, a structural engineer may be using Autodesk Revit while the architect might prefer ArchiCAD. Historically, exchanging files between these platforms was challenging, often leading to data loss or errors. 
That’s where IFC comes in. IFC is designed to maintain the data integrity of building models and ensure that critical information, like dimensions, materials, structural components, and even energy performance data, can be shared and understood by different software.
An important thing to note is that IFC is not just a CAD file. It’s a data-rich model format that holds information not only about the geometry of a building but also about its functional characteristics—like fire resistance, thermal performance, and cost estimations.
How IFC Works?
When an IFC file is created, it contains an abstract representation of a building’s physical components and the relationships between them. Unlike CAD files that are primarily visual, IFC files contain deep metadata—each element in the file is tagged with specific data. For example, an IFC file for a door could include its size, material, supplier, fire rating, and installation requirements.
IFC works by using classes and objects. Each class represents a building element, such as walls, windows, doors, etc., and each object within that class contains the detailed properties of that element. For instance, an object for a window may specify its size, material (e.g., aluminum frame), and any associated thermal or acoustic properties.
This level of detail is essential because it allows for much more than just a geometric representation. It enables teams to analyze and collaborate on the building’s performance, not just its shape.
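The class-and-object idea above can be sketched in a few lines of plain Python. This is not the real IFC schema (which is defined by buildingSMART and is far richer); the class names and property keys below are simplified illustrations of how an IFC entity carries structured metadata, not just geometry.

```python
from dataclasses import dataclass, field

# Simplified stand-ins for IFC classes such as IfcDoor or IfcWindow.
# The real schema is far richer; these names and properties are
# illustrative only.
@dataclass
class BuildingElement:
    name: str
    properties: dict = field(default_factory=dict)

@dataclass
class Door(BuildingElement):
    pass

# An "object" is one concrete element with its metadata attached,
# mirroring how an IFC entity holds data beyond pure geometry.
front_door = Door(
    name="Entrance Door D-01",
    properties={
        "WidthMm": 900,
        "Material": "Oak",
        "FireRatingMinutes": 30,
    },
)

print(front_door.properties["FireRatingMinutes"])  # → 30
```

Because the data travels with the element, a downstream tool can query the fire rating or material without ever rendering the geometry, which is the core advantage IFC offers over purely visual CAD exchange.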
What is OpenBIM?
OpenBIM is a philosophy, a movement within the AEC industry, advocating for the use of open standards like IFC to enable better collaboration. In the traditional world of proprietary BIM tools, each software vendor controls the formats their software uses. OpenBIM flips that model by promoting interoperability—the idea that software programs from different vendors should be able to communicate with each other seamlessly.
Key Principles of OpenBIM
Open Standards: OpenBIM relies on non-proprietary, universally accepted standards, with IFC being the most well-known example. Open standards ensure that data can be accessed, used, and understood by anyone, now and in the future.
Interoperability: One of the key goals of OpenBIM is to ensure that different software tools can work together seamlessly. Whether you’re using Revit, ArchiCAD, Tekla Structures, or any other BIM tool, OpenBIM makes it possible for everyone to work from the same set of information, regardless of their software preferences.
Collaboration: OpenBIM emphasizes open collaboration between all project stakeholders, such as architects, engineers, contractors, and owners. The key here is that no one is restricted by the software tools they use—everyone can work together on the same model, share data, and access updates in real time.
Long-term Access: OpenBIM ensures that the data is not locked into a specific software ecosystem. This is particularly important for long-term facility management and for projects that need to last for many decades, as it guarantees future generations will be able to access, update, and modify the model.
What are the benefits of IFC and OpenBIM?
IFC and OpenBIM bring a lot of important benefits to the construction and design industry, especially when it comes to working with different people and software. Here is a simple breakdown of their benefits:
Works Across Different Software
One of the biggest benefits of IFC and OpenBIM is that they allow different software to talk to each other. Architects might use ArchiCAD, engineers might use Revit, and contractors might prefer Tekla or Navisworks. Normally, these programs don’t work well together. But when they all export or import files using the IFC format, everyone can share and view the same model without needing to switch tools. This makes communication much smoother across the whole project team.
No More Being Stuck With One Software
IFC gives you freedom. In the past, once you started a project in one software, you were stuck with it. If you wanted to switch, you’d often lose data or have to redo your work. With IFC, that’s not the case. You can move your project from one tool to another without losing information. This means you’re not tied down to one brand or subscription—you can choose what works best for you and your team.
Makes Teamwork Easier
OpenBIM promotes better collaboration. When everyone can see the same model and work with the same information, it reduces confusion. Instead of passing files back and forth and hoping they line up, all team members can contribute to one shared digital model. This keeps everyone on the same page, which saves time and avoids costly misunderstandings later in the project.
Keeps Your Data Safe for the Future
IFC is an open and non-proprietary format, which means it’s not owned by any software company. This is great for long-term projects or future building upgrades. Even if the software you used becomes outdated or goes out of business, your IFC files will still be readable. This helps protect your work and keeps it usable for many years.
Helps Catch Mistakes Early
Using IFC and OpenBIM makes it easier to check the project model for errors before construction begins. Tools like Solibri, Navisworks, or Revizto can read IFC files and run clash detection to find problems, such as pipes running into beams or walls overlapping. Fixing these issues during the design phase is much cheaper and faster than fixing them on site.
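The idea behind clash detection can be illustrated with a crude axis-aligned bounding-box check. Real tools such as Solibri or Navisworks work on full geometry; this sketch only compares boxes, and all element names and coordinates are made up for the example.

```python
from dataclasses import dataclass

# A first-pass clash check: two elements may collide only if their
# axis-aligned bounding boxes overlap on all three axes.
@dataclass
class Box:
    name: str
    min_xyz: tuple
    max_xyz: tuple

def clashes(a: Box, b: Box) -> bool:
    # Boxes overlap only if they overlap on every axis.
    return all(
        a.min_xyz[i] < b.max_xyz[i] and b.min_xyz[i] < a.max_xyz[i]
        for i in range(3)
    )

beam = Box("Beam B-12", (0, 0, 3.0), (6, 0.3, 3.4))
duct = Box("Duct D-07", (2, 0.1, 3.2), (3, 0.25, 3.6))
wall = Box("Wall W-03", (10, 0, 0), (10.2, 5, 3))

print(clashes(beam, duct))  # True: the duct runs through the beam
print(clashes(beam, wall))  # False: the wall is well clear
```

Production tools refine the box-overlap pass with exact geometry tests and tolerance rules, but the box check is a common cheap filter because it rules out most non-clashing pairs quickly.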
Saves Time and Money
When everyone is working with accurate and up-to-date information, it reduces the chances of rework and delays. You avoid doing things twice or making changes late in the process. This leads to faster decision-making, fewer errors, and more efficient construction. In short, OpenBIM helps your project run more smoothly and saves money in the long run.
Helps with Rules and Regulations
More and more governments and public projects are now requiring the use of IFC and OpenBIM. This is because open formats make it easier to review and manage project data. If your team is already using these standards, it becomes easier to qualify for public or international projects and follow local building regulations.
Makes Facility Management Easier
Once the building is finished, the IFC model can still be very useful. It can contain information about equipment, materials, room sizes, maintenance schedules, and more. Facility managers can use this model to operate and maintain the building more effectively. This saves money over time and keeps the building in better shape.
Increases Transparency
OpenBIM makes it clear who did what and when. Since all information is openly shared and recorded, it’s easier to track decisions and changes. This builds trust among project partners and helps avoid disputes. Everyone knows what’s happening and has access to the same information.
Keeps You Ready for the Future
Finally, adopting IFC and OpenBIM now helps you stay ahead in the industry. As technology continues to grow, the push for open standards will only increase. By using them today, you’re preparing your team to handle future projects, tools, and client expectations with confidence.
Challenges of Using IFC and OpenBIM
While OpenBIM and IFC offer a lot of great advantages, like better collaboration and freedom from software lock-in, they’re not without a few bumps in the road. Just like with any technology, there are some challenges you might run into when using them in real-life projects.
One of the common problems is software compatibility. Even though IFC is meant to be a universal format, not every software handles it the same way. Some programs might not fully support all parts of an IFC file, or they might display things incorrectly. For example, a wall created in one software might look different or lose some details when opened in another. This can lead to confusion, missing data, or even mistakes in the project if the information doesn’t come through the way it was intended.
Another issue is file size and performance. IFC files can get really big, especially on large and detailed projects. When you have a model with thousands of elements—walls, floors, furniture, MEP systems—it all adds up. These heavy files can slow things down, especially if the computer or the software isn’t built to handle such complex models smoothly. This might make navigation clunky or even crash the program sometimes, which can be frustrating during tight deadlines.
There’s also a bit of a learning curve involved. If your team is used to working only in one software like Revit or ArchiCAD, switching to OpenBIM workflows can feel unfamiliar at first. Understanding how IFC works, how to export and import files properly, and how to troubleshoot issues can take some time. It’s not overly complicated, but it does require a bit of training or hands-on experience to get comfortable with it—especially when dealing with advanced coordination or data-rich models.
Lastly, there’s the challenge of lack of standardization. Even though IFC is an open standard, there are actually different versions of it—like IFC2x3 or IFC4—and not every project team uses the same one. On top of that, some teams only use specific parts of IFC, depending on their needs. So, when two different teams exchange models, their versions or interpretations might not align perfectly. This can lead to inconsistencies or gaps in the data, which kind of defeats the purpose of having a shared format in the first place.
Who Should Use IFC and OpenBIM?
When it comes to working on building projects, a lot of different professionals are involved—architects, engineers, contractors, and facility managers. Each of them plays a big role at different stages of the project. But here’s the thing: they don’t always use the same software. That’s where IFC (Industry Foundation Classes) and OpenBIM come in. These tools are like a common language that helps everyone share information without worrying about which software they’re using.
Architects should definitely use IFC and OpenBIM. Why? Because it allows them to send their 3D design models to other team members without any headaches. Imagine you’re an architect using ArchiCAD, and your structural engineer is working in Revit. Normally, that might cause issues. But with IFC, you can both share and view the model clearly, without losing any of the details. It helps architects keep their design intent intact, no matter who’s looking at the file.
Now let’s talk about engineers—whether they’re structural, mechanical, electrical, or civil. Engineers often need to work closely with architects and coordinate with other teams. Using OpenBIM allows them to share their models, run clash detection, and do analysis, even if they’re using different tools. For example, a mechanical engineer can send a model with all the ducts and pipes, and a structural engineer can check it to make sure nothing clashes with beams or walls. That’s teamwork made easier.
Contractors and builders also benefit a lot from OpenBIM. During construction, they need access to detailed and accurate models to understand what needs to be built. With IFC files, they can see everything in one place, from materials to dimensions. It helps them plan better, avoid mistakes, and keep the project on track. They don’t have to worry about what software the model was made in—OpenBIM makes sure it works for them too.
Finally, facility managers and building owners have the longest relationship with the building. Even years after construction is done, they still need to maintain and update the building. With IFC, they can receive a digital twin of the building that doesn’t depend on a specific software. This means they can access all the building information—like room sizes, equipment, and maintenance schedules—without any compatibility issues. It gives them flexibility and long-term control over their property.
Conclusion
In today’s construction landscape, where collaboration, efficiency, and accuracy are paramount, IFC and OpenBIM are fundamental to ensuring the smooth execution of projects. By embracing these open standards, you’re ensuring that your team can work together more effectively, using the best tools for the job while maintaining data integrity across different platforms.
Incorporating IFC and adopting the OpenBIM approach into your workflow will not only future-proof your projects but also create a more transparent, efficient, and collaborative process from design through to facility management.
justinspoliticalcorner · 9 months ago
Don Moynihan at Can We Still Govern?
Authoritarian regimes tend to prioritize controlling certain institutions: bureaucracy, the legal system, and higher education. Trump has shown deep interest in all. They also seek to control the press. It is easy to assume this last threat does not apply to the US. The First Amendment provides strong protections to private media (although conservatives are interested in eroding those protections). The Corporation for Public Broadcasting is vulnerable to right-wing threats of defunding, but NPR and PBS have other sources of funding and play a less central role in American media than, for example, the BBC does in the UK. Conservatives have their own thriving media, have captured large areas of social media, including Twitter, and have been successful at working the ref when it comes to the mainstream media. It might therefore seem that there is little reason to worry about Trump further weaponizing the press.
But there are aspects of the media that the US government does directly control and fund. Americans don’t pay a lot of attention to it, because it does not broadcast to them, but it has an outsize global presence. This is the U.S. Agency for Global Media, which oversees Voice of America, and is the primary funder of several other regional broadcasting networks. It has a budget of about $1 billion and 4,000 employees. While it does not have the same brand recognition as other media, AGM has a vast global audience. It estimates that it reaches over 420 million people weekly in 64 different languages. So, the US government is very much in the media business, and on a global scale. To understand how that media can be weaponized by an authoritarian in the future, we just need to look back at the final months of the Trump administration.
“Gross Mismanagement” by a Trump Appointee
The tensions between journalistic integrity and politicization exploded with the arrival of Michael Pack, the chief executive of AGM. In Spring of 2020, Trump falsely accused VOA of peddling Chinese propaganda over Covid, and then pushed Pack’s nomination, which had been stalled because of Senate concerns about his ideological leanings, and financial improprieties in his business dealings. Pack headed the Claremont Institute, which has pushed anti-democratic messaging via fellows like John Eastman. Pack had also worked on documentaries produced by Steve Bannon.
Pack perfectly reflected the paranoid style that took hold in Trump’s last year in office. After his first impeachment, loyalty mattered above all else to Trump. Pack fit that agenda. He promised “to drain the swamp, to root out corruption, and to deal with these issues of bias” — an ethos that defines how Trumpworld is approaching a second term.
The scale of Pack’s abuses only became evident after he left the agency. He profiled his employees, speculating on their political leanings, and met with officials to prioritize which employees should be fired because of those political beliefs. He pushed out senior officials, putting some on administrative leave for daring to disagree with him. Pack hired a conservative law firm on a no-bid contract to investigate his own employees. It cost $1.6 million. He revoked security clearances from journalists who complained about his practices. He sought to install unqualified appointees to oversee overseas media networks and make their tenure permanent. His political appointees either had no experience in journalism or were aligned with right-wing media outlets such as The Daily Caller.
Pack improperly tried to block funding for one unit, The Open Technology Fund, that Congress had appropriated. He froze the funding of other units, and blocked their recruitment efforts. Pack also created instability by removing senior officials leading parts of his organization — Radio Free Europe/Radio Liberty, Radio Free Asia, the Office of Cuban Broadcasting, the Middle East Broadcasting Networks and the Open Technology Fund.
[...] Making AGM a more explicitly partisan outlet would obviously undermine its reputation for providing quality journalism, eroding the trust that is central to its ability to do its job. It’s a clear example of how a more partisan operation reduces the quality of agency performance and its ability to fulfill its mission. The mission of AGM “is to inform, engage, and connect people around the world in support of freedom and democracy.” It is hard to do so if the agency is, itself, dancing to an authoritarian tune.
Don Moynihan’s Can We Still Govern? post reveals that the Voice of America, which is targeted toward international listeners, could become a partisan propaganda organ if Donald Trump wins again.
dollsonmain · 2 years ago
Not about being sick, but about money related to being sick.
Long blathering.
That Guy is now waffling on getting me insurance which comes as no surprise.
He left me alone and unguided other than "Just go to the ACA website and pick one." to search for an insurance policy and then when I came back to him with 6 different ones, he said I was researching it wrong.
I was like, "You did not give criteria. If you want something researched under certain criteria, you have to say that from the start." He didn't like that and asked me with full incredulity why didn't I already know everything there is to know about health insurance?
After some back and forth he was aghast and asked me did I seriously never, in the past two decades, get curious about how health insurance works and research it.
No. Why would I? Insurance was not available to me so it didn't matter how it worked. Insurance I'd had before was provided by my workplaces with no options as to which company or policy type I had. I am not curious about and not spending my free time researching the intricate inner workings of health insurance that I couldn't have anyway.
All I know is what others have mentioned here: Sometimes it's really difficult to get the right treatment and meds because insurance doesn't want to pay for it.
That he acts like I'm stupid for not knowing something like that is really irritating because it's not like he sits around learning everything there is to know about income tax.
I had reason to research income tax regarding hobby income, and so I did.
I informed him of something he didn't know about income tax: He can deduct my medical bills as long as he continues to claim me as a dependent. I didn't treat him like he was an idiot for not knowing that because he's never been curious about it, never sat down and researched it, didn't even know to wonder about whether or not that was a possibility, and pays a tax guy to do it all for him with the absolute minimum input and responsibility on himself.
The same situation I've been in regarding health insurance.
Anyway, he was also like "Calculate up all of the premiums and the expected yearly out of pocket, and if that is more than [parathyroidectomy] then it's not worth it to get insurance, and we won't."
Which... Is illogical....
The surgery is approximately $20k not including physicians fees, anesthesia, etc. and also not factoring in the self-pay 50% discount. But that's a vague estimate.
We've spent about $80k this year on surgeries, but he's not factoring that unexpected expense in to his expected medical expenses moving forward, he's only thinking about the surgery and not anything else like long-term medication, follow ups, complications, or anything else that might happen unrelated to problems we already know about.
The insurance policies I was looking at would cost between $16k and $18k/year between the premiums and the max projected out-of-pocket expense. We both know there's a risk of those policies deciding not to cover the cost of my medical needs and because of that he doesn't want to pay into a policy that isn't going to give me the coverage he wants. He mentioned being forced by the insurance company to take medication forever instead of getting surgery, and I countered that I may need to take medication forever even after surgery because of how severe my case is. I have a high risk of my blood calcium swinging hard into hypocalcemia because it's so excessively hyper- right now.
So, according to the criteria he's made known, the only medical care I will need going forward until I die is this parathyroidectomy and that's the only determining factor regarding whether or not he's willing to pay for a health insurance policy for me.
-
He also mentioned me finding work, specifically work from home, which he didn't want me to do before because he liked being 100% in control financially and me earning my own money means a chance of freedom from him because I wouldn't need him to cover my living expenses.
I told him that in order to work I really do need to be managed so working from home isn't going to work for me. Just look at how efficiently I sold ponies this year. I made 0 listings this year. I hardly touched a pony all year.
He mistook that as me saying I need my work place and my home place to be separate, but that's not what I was saying.
Either way, work from home probably isn't going to work for me not only because I will either work incessantly until I hurt myself or not work at all, but also because every time I've mentioned it he starts in again on not wanting foreign devices on our home network, or me going to weird websites that could infect my computer and then the home network, or, those foreign devices surveilling him I mean us like the government would if I were on Medicaid or......
Also, I've been trapped in this house for 16 years. I wouldn't mind an inarguable excuse to leave.
That does mean he'd have to get me a car, which he also doesn't want to do because that's more money.
-
I had an ok if painful job and preposterously good insurance before he knocked me up by ignoring our agreed upon contraceptive method and then decided we should move way out into a suburb with no walkable work available into a half-a-mil house in the middle of nowhere.
I do not sympathize with his financial gripings.
-
He's also said that he's looking for somewhere else to work but thinks he screwed himself in the industry.
The job he's done for the past 20 years is one where your reputation and contacts mean more than anything else, and he said he's been "Very vocal about everything that's wrong with this country..." for the past few years.
Like I'd said before, the pandemic brought out the worst in him, and he's been shitting on all of his relationships since, apparently including professional ones. I do know he yelled at the person who was his shoo-in to finding different offices looking for people, because he didn't like the government forcing everyone to wear masks during the height of the pandemic.
That's stupid on his part because he has no other experience or education to fall back on in order to jump to another career path. Outside of the industry, he's just another useless rando with a high school diploma, no work experience, and no higher education.
He likes to think he's smarter than everyone else, but isn't smart enough to keep his personal political opinions out of the workplace.
Not that I'm in any different a situation, education-wise, but I've not had access to money to change that and he has.
coochiequeens · 1 month ago
Save the Earth. Don't have sons.
By AJIT NIRANJAN
Eating red meat and driving cars explain almost all the difference in pollution between men and women. Imago/ZUMA
This story was originally published by the Guardian and is reproduced here as part of the Climate Desk collaboration.
Cars and meat are major factors driving a gender gap in greenhouse gas emissions, new research suggests.
Men emit 26 percent more planet-heating pollution than women from transport and food, according to a preprint study of 15,000 people in France. The gap shrinks to 18 percent after controlling for socioeconomic factors such as income and education.
Eating red meat and driving cars explain almost all of the 6.5 to 9.5 percent difference in pollution that remains after also accounting for men eating more calories and traveling longer distances, the researchers said. They found no gender gap from flying.
“Our results suggest that traditional gender norms, particularly those linking masculinity with red meat consumption and car use, play a significant role in shaping individual carbon footprints,” said Ondine Berland, an economist at the London School of Economics and Political Science and a co-author of the study.
Research into gender gaps is often plagued by difficult decisions about which factors to control for, with seemingly independent variables often confounded by gendered differences. Men need to eat more calories than women, for instance, but they also eat disproportionately more than women. They also have higher average incomes, which is itself correlated with higher emissions.
Previous research from Sweden has found men’s spending on goods causes 16 percent more climate-heating emissions than women’s, despite the sums of money being very similar.
Marion Leroutier, an environmental economist at Crest-Ensae Paris and a co-author of the study, said: “I think it’s quite striking that the difference in carbon footprint in food and transport use in France between men and women is around the same as the difference we estimate for high-income people compared to lower-income people.”
The most powerful actions a person can take to cut their carbon pollution include getting rid of a gas-powered car, eating less meat, and avoiding flights.
But efforts to challenge car culture and promote plant-based diets have provoked furious backlashes from pundits, who perceive it as an attack on masculinity. The term “soy boy” has been used by far-right figures, including Vice President JD Vance and the self-described misogynist influencer Andrew Tate, to present progressive men as weak.
Soy is a common protein source in vegan cuisine, but three-quarters of the world’s soya beans are fed to animals to produce meat and dairy.
The French researchers suggested the gender differences in emissions could explain why women tend to be more concerned about the climate crisis, arguing the greater personal cost of reducing their emissions could cause men to avoid grappling with the reality of the climate emergency.
But they added that greater climate concern could lead women to do more to cut their emissions. “More research is needed to understand whether these differences in carbon footprints are also partly due to women’s greater concern about climate change and their higher likelihood of adopting climate-friendly behaviors in daily life,” Leroutier said.
shockmastervoidstate · 1 year ago
Your thoughts you must invest 💰, for they will manifest
I was inspired by a lecture from Neville Goddard to make a post on why you have to gain control over your mind and the thoughts that you think.
Often, thoughts are compared to seeds that will manifest your reality once they are planted.
It's a nice comparison, but I realized that you can also compare thought to money. And if you want your money to grow, you must invest it.
On average, a human being has 70,000 thoughts a day. You could compare that to a salary of 70,000 dollars.
This salary leaves a lot for you to invest. I must admit, many thoughts we have are aimed at solving problems at work or at school. Just like how we must spend some of our salary on necessary things like rent, food, gas etc.
But there will be cash left for us to spend or invest however we want. Let's assume that 50,000 dollars are spent on necessities, so we are left with 20,000 dollars to spend however we want.
It's the same with our thoughts. 50,000 are dedicated to chores and responsibilities, 20,000 are for ourselves. (Just an estimate to make a point, the actual numbers could be totally different.)
In between daily tasks that demand your full attention and your free time, especially at night when you go to bed, you have the opportunity to think about anything.
And you have the freedom to choose what you think about and how you think about it.
And you must choose wisely, because your thoughts create your reality. Whatever you give attention to grows until it becomes real.
You would want to invest your money in a way that makes you richer. At least you want to spend it on something you know you will appreciate.
You wouldn't waste it on something that you hate or on something that makes you poorer. Imagine you buy a shipwreck. It's completely worthless; you can't do anything with it.
But wait, the shipwreck is actually less than worthless — it is actively damaging to your wealth, because it will cost you even more money. You could repair it for a huge amount of money, leave it at the docks, which would cost a lot of rent, or have it delivered to a junkyard, which would also cost you a fortune.
Negative thoughts behave in the same way. They are damaging and instead of thinking them, you could have thought something that would have enriched your life.
When you buy junk with x amount of money, you lose the opportunity to buy an appreciating asset with x amount of money.
The thing is, the mainstream and materialistic school of thought is that thoughts have no inherent power, since reality doesn't react to them.
But we know, thanks to Neville Goddard and so many other wonderful people, that our reality does react to what we think — it mirrors our thoughts, assumptions and beliefs.
Negative thinkers don't know how much they ruin their own lives, but what's even worse is that they miss out on the opportunity to improve their lives and fulfill their biggest desires.
However, it's not easy to stop negative thinking. It's almost like an addiction and a vicious cycle — the more you think negatively, the more negative your life will get, providing you with more reasons to keep thinking negatively. Just like the shipwreck that will deplete your wealth even more, long after the purchase.
Only the most dedicated people will break free from their bad habit of thinking negatively, of repeating their old story. Neville said so himself.
I found an easy and effective way to break patterns of negative thinking. It is based on repeating positive affirmations whenever negative thoughts come up, blocking them.
I will make a post about it and explain how to learn it. It worked very fast for me and I can already see very positive changes in my life.
Till next time.
-shockmaster
damiankoh · 1 year ago
Finance 101 for marketers
In the world of business, it is common for specialized departments to operate in their silos. One notable example is the marketing team, which often focuses on brand building, creative and customer engagement, sometimes at the expense of a deeper understanding of the broader business and commercial workings of the company. This gap can lead to misaligned strategies and lost opportunities.
These are some of the common terms that I have come across over the past decade and that every marketer should have a basic understanding of.
GMV (Gross Merchandise Volume): This is the total sales value of merchandise sold through a particular marketplace over a specific time period. It measures the size of a marketplace or business, but not the company's actual revenue since it doesn't account for discounts, returns, etc.
Revenue: This is the total amount of income generated by the sale of goods or services related to the company's primary operations.
COGS (Cost of Goods Sold): This refers to the direct costs attributable to the production of the goods sold. This amount includes the cost of the materials and labor directly used to create the product.
Gross Margin: A financial metric indicating the financial health of a company. It's calculated as the revenue minus the cost of goods sold (COGS), divided by the revenue. This percentage shows how much the company retains on each dollar of sales to cover its other costs.
Operating Income: This is the profit realized from a business's core operations. It is calculated by subtracting operating expenses (like wages, depreciation, and cost of goods sold) from the company’s gross income.
Ordinary Income: This typically refers to income earned from regular business operations, excluding extraordinary income which might come from non-recurring events like asset sales or investments.
Net Profit: Also known as net income or net earnings, it's the amount of income that remains after all operating expenses, taxes, interest, and preferred stock dividends have been deducted from a company's total revenue.
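To make the relationships between GMV, revenue, COGS, gross margin, operating income, and net profit concrete, here is a minimal sketch. All figures are invented for illustration; none come from the post.

```python
# Illustrative only: made-up figures showing how the metrics above relate.
gmv = 1_000_000                            # total value of merchandise sold
discounts_and_returns = 150_000
revenue = gmv - discounts_and_returns      # what the company actually books
cogs = 510_000                             # direct material and labor costs
gross_margin = (revenue - cogs) / revenue  # share retained per dollar of sales
operating_expenses = 200_000               # wages, rent, depreciation, etc.
operating_income = revenue - cogs - operating_expenses
taxes_and_interest = 40_000
net_profit = operating_income - taxes_and_interest

print(f"Revenue:          {revenue:,}")           # 850,000
print(f"Gross margin:     {gross_margin:.1%}")    # 40.0%
print(f"Operating income: {operating_income:,}")  # 140,000
print(f"Net profit:       {net_profit:,}")        # 100,000
```

Note how each line of the income statement narrows the previous one: GMV overstates the business, revenue is what is actually earned, and net profit is what remains after everything else.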
PPWF (Price Pocket Waterfall): This term is used to describe the breakdown of the list price of a product or service down to the net price, showing all the factors that contribute to the price erosion. The "waterfall" metaphorically illustrates how the price "falls" or reduces step by step due to various deductions like discounts, rebates, allowances, and other incentives given to customers. This analysis is important for businesses to understand their actual pricing dynamics and profitability. It helps in identifying opportunities for price optimization and controlling unnecessary discounts or allowances that erode the final price received by the company.
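As a rough illustration of the waterfall idea (all prices and deduction names below are invented), the pocket price can be computed step by step:

```python
# Hypothetical price waterfall: each deduction moves the list price
# closer to the "pocket price" the company actually keeps.
list_price = 100.00
deductions = [
    ("Standard discount",       8.00),
    ("Volume rebate",           5.00),
    ("Promotional allowance",   4.00),
    ("Freight paid for buyer",  3.00),
]

price = list_price
print(f"{'List price':24s} {price:7.2f}")
for name, amount in deductions:
    price -= amount
    print(f"- {name:22s} {price:7.2f}")

pocket_price = price                                 # 80.00
erosion = (list_price - pocket_price) / list_price   # share of price given away
print(f"Price erosion: {erosion:.0%}")               # 20%
```

Laying the deductions out this way is the whole point of the analysis: it makes visible which individual concessions are eroding the price most.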
Net Present Value (NPV): A method used in capital budgeting and investment planning to evaluate the profitability of an investment or project. It represents the difference between the present value of cash inflows and the present value of cash outflows over a period of time.
Internal Rate of Return (IRR): A metric used in financial analysis to estimate the profitability of potential investments. It's the discount rate that makes the net present value (NPV) of all cash flows from a particular project equal to zero.
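A short, self-contained sketch of both metrics, using a hypothetical project (pay 1,000 today, receive 500 at the end of each of the next three years). The bisection-based IRR solver is an illustrative simplification, not a production method:

```python
# Cash flow at index t occurs at year t; index 0 is today's outlay.
cash_flows = [-1000, 500, 500, 500]

def npv(rate, flows):
    """Discount each cash flow back to today and sum."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(flows))

def irr(flows, lo=0.0, hi=1.0, tol=1e-8):
    """Find the rate where NPV crosses zero, by bisection.
    Assumes NPV is positive at `lo` and negative at `hi`."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid, flows) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

print(f"NPV at 10%: {npv(0.10, cash_flows):.2f}")  # 243.43
print(f"IRR:        {irr(cash_flows):.1%}")        # 23.4%
```

Note the relationship between the two: the IRR is, by definition, the discount rate at which the NPV is exactly zero.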
CONQ (Cost of Non-Quality): This is the cost incurred due to providing poor quality products or services. It includes rework, returns, complaints, and lost sales due to a damaged reputation.
A&P (Advertising and Promotion): These are expenses related to the marketing and promotion of a company's products or services. It's a subset of the broader marketing expenses a company incurs.
Return on Investment (ROI): In simple terms, ROI measures the profitability of an investment. For marketing teams, this means understanding how campaigns contribute to the company's bottom line, beyond just tracking engagement metrics.
Return on Ad Spend (ROAS): ROAS specifically measures the efficiency of an advertising campaign. It assesses how much revenue is generated for every dollar spent on advertising. It's similar to ROI but focused solely on ad spend and the revenue directly generated from those ads. ROAS is exclusively used in the context of advertising and marketing. It helps businesses determine which advertising campaigns are most effective.
Customer Lifetime Value (CLV): This predicts the net profit attributed to the entire future relationship with a customer. Effective marketing strategies should aim at not only acquiring new customers but also retaining existing ones, thus maximizing CLV.
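A quick sketch of how ROI, ROAS, and a simple CLV are typically computed. Every figure below is hypothetical, and the CLV formula shown is the most basic retention-based version:

```python
# Hypothetical campaign and customer figures (all numbers invented).
ad_spend = 10_000
revenue_from_ads = 45_000
profit_from_ads = 12_000        # after ad spend, COGS, and other costs

roas = revenue_from_ads / ad_spend   # revenue per advertising dollar
roi = profit_from_ads / ad_spend     # profit per dollar invested

# Simple CLV: margin per customer per year, scaled by how long the
# average customer stays, minus the cost to acquire them.
annual_margin_per_customer = 120
avg_retention_years = 3
acquisition_cost = 80
clv = annual_margin_per_customer * avg_retention_years - acquisition_cost

print(f"ROAS: {roas:.1f}x")  # 4.5x: each ad dollar returns $4.50 in revenue
print(f"ROI:  {roi:.0%}")    # 120%: each dollar invested returns $1.20 profit
print(f"CLV:  ${clv}")       # $280
```

The contrast in the first two lines is the key distinction from the definitions above: ROAS divides revenue by ad spend, while ROI divides profit by the investment, so ROAS will always look more flattering than ROI for the same campaign.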
G&A (General and Administrative Expenses): These are the overhead costs associated with the day-to-day operations of a business. They include rent, utilities, insurance, management salaries, and other non-production-related costs.
travaholic · 1 year ago
How Social Media Marketing Is Different From Traditional Marketing
The marketing world has undergone a significant transformation in recent times. While traditional marketing channels like print, television, and radio still hold value, the rise of social media has opened a dynamic new avenue for businesses to connect with their target audience.
This article delves into the key differences between social media marketing and traditional marketing, highlighting the unique strengths and considerations of each approach.
Reach and Audience Targeting
·   Traditional Marketing: Relies on demographics and broadcast messages. Newspapers, magazines, and television cater to a general audience within a specific area. Targeting specific demographics can be achieved through strategic ad placement, but true individualization is limited.
·   Social Media Marketing: Enables laser-focused targeting. Platforms like Facebook and Instagram allow marketers to target users based on a multitude of factors, including age, location, interests, and online behavior. This granular targeting ensures messages reach the most relevant audience segment, maximizing campaign effectiveness.
Interaction and Engagement
·   Traditional Marketing: Primarily a one-way communication channel. Print ads, billboards, and even television commercials are delivered without the ability for immediate audience response.
·   Social Media Marketing: Fosters two-way communication and engagement. Social media platforms provide a space for direct interaction between brands and their audience. Companies can respond to comments and messages, answer questions, and address concerns in real time, fostering a sense of community and loyalty.
Content and Creativity
·   Traditional Marketing: Content is static and limited by format. Print ads rely on visuals and text, while television commercials are restricted by time constraints.
·   Social Media Marketing: Offers a diverse range of content formats. Businesses can leverage images, videos, live streams, and even interactive polls and quizzes to capture audience attention and deliver their message creatively and engagingly.
Cost and Measurement
·   Traditional Marketing: Costs can be high, particularly for prime-time television slots or large-scale print campaigns. Measuring the return on investment (ROI) can be challenging, often relying on estimates and indirect metrics.
·   Social Media Marketing: Offers a more cost-effective approach. Many social media platforms provide free business accounts, and paid advertising options are often flexible and budget-friendly. Tracking campaign performance is significantly easier with built-in analytics that provide insights into reach, engagement, and conversions.
Data Analytics
·   Traditional Marketing: Limited data available to measure campaign effectiveness. Traditional methods often rely on estimates and indirect metrics like website traffic fluctuations or an increase in sales calls.
·   Social Media Marketing: Provides comprehensive data and analytics. Platforms offer insights into impressions, engagement metrics (likes, comments, shares), website clicks, and even conversion rates. This data allows marketers to refine their strategy, optimize content, and measure the true impact of their social media efforts.
Crisis Management and Brand Reputation
·   Traditional Marketing: Limited ability to address negative feedback or public relations issues promptly. Responding to criticism in print or television ads often requires time and additional resources.
·   Social Media Marketing: Enables real-time crisis management. Businesses can directly address customer concerns and negative feedback on social media platforms, allowing them to control the narrative and minimize potential damage to their brand reputation.
Additional Considerations and The Future of Marketing
While the core differences between social media marketing and traditional marketing have been addressed, here are some additional factors to consider:
Integration and Cohesiveness:
·       Combining traditional and social media marketing efforts can create a powerful synergy. Traditional channels can be used to drive users to social media platforms, where deeper engagement and conversions can occur.
·       Consistency in messaging and brand identity across all marketing channels is crucial for building a strong brand presence.
Rise of Influencer Marketing:
·       Social media has led to the rise of influencer marketing, where businesses collaborate with individuals who have built a large and engaged online audience.
·       Influencer marketing can be a highly effective way to reach a targeted audience and leverage the credibility and trust established by the influencer.
Evolving Social Media Landscape:
·       Social media platforms are constantly evolving, with new features and functionalities emerging regularly. Marketers need to stay up-to-date with these changes to adapt their strategies and ensure they are using the latest tools and trends to their advantage.
Focus on User-Generated Content:
·       Encouraging user-generated content (UGC) through contests, hashtags, and interactive campaigns can be a powerful social media marketing strategy.
·       UGC fosters a sense of community, increases brand authenticity, and allows businesses to leverage the creativity and reach of their audience.
The Future of Marketing: Embracing Personalization
Looking ahead, the future of marketing lies in personalization.
·       By leveraging the vast amount of data available through social media and other digital channels, businesses can tailor their content and messaging to individual user preferences and needs.
·       This level of personalization will be key to building stronger customer relationships and driving engagement in an increasingly competitive marketplace.
Conclusion:
Social media marketing has revolutionized the way businesses connect with their audience. While traditional marketing channels still hold value, the ability to target specific demographics, foster two-way communication, and measure campaign performance in real time makes social media an essential element of any modern marketing strategy.
By embracing the unique strengths of both traditional and social media marketing, businesses can create a comprehensive strategy that drives brand awareness, audience engagement, and ultimately, business growth.