#DataKind
Explore tagged Tumblr posts
pythonjobsupport · 1 year ago
Text
DataKind Nonprofit Data 101: Data Governance & Security
Are you a social impact professional looking for some practical guidance on how to use data concepts or technologies? Do you … source
View On WordPress
0 notes
lanabriggs · 4 months ago
Text
AI and Data-Driven Philanthropy: Redefining Impact in the Digital Age
Artificial intelligence has rapidly expanded its presence in sectors ranging from healthcare to finance. One area where its influence is growing, yet often underexplored, is philanthropy. Data-driven decision-making powered by AI is transforming how nonprofits, foundations, and donors identify needs, allocate resources, and measure impact. The integration of intelligent technologies is helping organizations do more with limited resources, make smarter choices, and deliver more targeted support to the people and communities that need it most.
The Role of AI in Shaping Modern Philanthropy
Artificial intelligence can process complex datasets in real time, detect patterns, and predict outcomes — making it an ideal partner for mission-driven organizations. For example, AI systems can help identify communities at high risk of food insecurity, forecast where health interventions might be needed, or assess which programs are underperforming and why.
Some nonprofits now use natural language processing (NLP) tools to analyze feedback from beneficiaries and donors, uncovering insights into satisfaction, needs, and potential areas for improvement. Others rely on machine learning models to optimize fundraising campaigns or tailor communication to specific audience segments.
This isn’t about replacing human intuition or compassion — it’s about enhancing it. By using AI to process information at scale, philanthropic organizations can become more proactive, agile, and precise in their efforts.
Smarter Giving Through Data Integration
AI becomes especially powerful when paired with large volumes of quality data. With growing access to open datasets from governments, NGOs, and social researchers, philanthropic institutions can combine their own data with public sources to build detailed community profiles and understand overlapping issues.
For example, combining health data with income levels and geographic information can help pinpoint areas where mobile clinics or mental health services would be most effective. Similarly, donor engagement platforms are increasingly using AI-driven analytics to identify patterns in giving behavior, enabling organizations to engage supporters more meaningfully and at the right time.
Data-driven philanthropy also improves transparency and accountability. Donors increasingly want to know where their money goes, how it is used, and what outcomes it produces. With the help of AI, organizations can track impact metrics and report on progress with greater accuracy.
Eric Hannelius, a fintech entrepreneur and advocate for responsible innovation, believes the shift toward data-informed giving reflects a broader transformation in how organizations define and pursue impact.
“AI is enabling a new era of philanthropy — one that values clarity and measurable outcomes,” says Eric Hannelius. “It’s allowing us to better understand where needs exist and how we can respond more effectively. This doesn’t take the heart out of giving — it brings deeper insight into how to direct that heart toward meaningful action.”
He also emphasizes the importance of ethics and intent in implementing such technologies. “Responsible use of AI means asking tough questions about bias, privacy, and access,” Eric Hannelius continues. “It’s not enough to have the right tools. You need the right framework guiding how they’re used.”
Applications in the Field: Real-World Examples
Several organizations are already demonstrating the possibilities of data-driven philanthropy enhanced by AI:
GiveDirectly, a nonprofit that facilitates direct cash transfers, uses data modeling to identify the poorest communities in sub-Saharan Africa, ensuring that funds reach those most in need.
Charity: Water has adopted predictive maintenance powered by AI to keep rural water pumps functioning, helping them reduce downtime and ensure consistent access to clean water.
DataKind, a global nonprofit, connects data scientists with humanitarian organizations to develop AI tools for projects such as early childhood education in underserved communities and disaster response planning.
These efforts are showing that strategic application of data and machine intelligence can lead to more effective programs and deeper, long-term impact.
Balancing Innovation with Human Connection
While the promise of AI in philanthropy is real, it must be handled with care. Algorithms can replicate bias if not trained properly, and over-reliance on data may lead some organizations to overlook qualitative experiences or smaller communities that don’t show up in big datasets.
To avoid these pitfalls, many organizations are investing in internal expertise, fostering collaboration between data scientists and field workers, and incorporating community voices into their data interpretation and decision-making processes.
The future of philanthropy will be built on a foundation that values both numbers and narratives.
Looking Ahead: What Comes Next?
As AI tools become more accessible and user-friendly, their role in philanthropy is expected to expand. Low-code platforms and plug-and-play analytics dashboards are already empowering small nonprofits to harness data in ways that were once limited to large institutions.
Expect further developments in predictive giving behavior, real-time program evaluation, and customized impact reporting. Partnerships between tech companies and nonprofits will also grow, enabling knowledge sharing and innovation on a global scale.
Still, the human element will remain essential. Technology can highlight problems and suggest solutions, but the decisions to act — and the ways to do so — must always be guided by empathy, trust, and a clear mission.
The convergence of artificial intelligence and philanthropy offers a remarkable opportunity: to design charitable efforts that are as thoughtful in execution as they are generous in intent. With the right combination of data, ethics, and insight, organizations can turn good intentions into sustained outcomes.
0 notes
wedesignyouny · 5 months ago
Text
Navigating the Data Science Job Boards Roundup - Dataforce
Key Highlights
Discover the top data science job boards to boost your career prospects.
Explore platforms like LinkedIn and Indeed, alongside niche options.
Find tailored resources for data analysts, engineers, and machine learning experts.
Learn how to craft a standout resume and network effectively.
Get ready to land your dream job in the booming field of data science!
Introduction
Starting a job search in the exciting world of data science can be both fun and tough. There are many data science job boards to choose from, so it’s important to target the ones that give you the best chance of success. This blog will be your complete guide. It will show you the top general and niche job boards for people who love machine learning, data analysts, and experienced data scientists.
Comprehensive Guide to Data Science Job Boards
Finding the right job opportunities takes careful planning. Knowing where to focus your efforts can greatly help. This guide is useful for anyone, whether you just graduated or have years of experience and want new challenges. Let’s explore data science job boards to help you with your job search.
1. Essential Platforms for Data Science Opportunities
Your job search journey starts with key platforms that cover many industries and experience levels. Here are the top general-purpose sites, offering broad coverage of data science roles:
LinkedIn: This well-known professional network is essential. It has great search filters, strong industry connections, and useful company pages. LinkedIn is important for anyone looking for a data science job.
Indeed: As one of the biggest job sites in the world, Indeed gathers listings from many sources. Its wide reach helps find a variety of opportunities.
Company Websites: Don’t overlook the value of checking the careers section of companies directly. Many companies prefer applications sent through their websites.
2. Niche Job Boards Tailored for Data Scientists
Once you know about the main platforms, think about checking out niche job boards specially made for the data science field. These sites focus on specific jobs and bring in top-notch employers:
DataJobs: This site is all about finding data-related jobs. It keeps things focused just on that.
Kaggle: Famous for its machine learning contests, Kaggle also has a job board that attracts the best tech companies.
Data Elixir: This site has a weekly newsletter that shows helpful job postings, which can save you time looking for jobs.
3. Where to Find Data Analyst Roles
For both new and experienced data analysts, several platforms are worth focusing on:
Analytics Vidhya: This well-known online community offers learning resources and has a job board just for data analysts.
DataKind: This group connects data professionals with projects that have a positive social impact. It provides unique opportunities.
Also read job descriptions on the general boards closely, since many list data analyst openings under broader analytics titles.
4. Top Picks for Machine Learning and AI Positions
If you really like machine learning and artificial intelligence, these platforms can help you find great AI jobs:
MLconf: Known for its conferences, MLconf also has a job board. This connects businesses with talented people in machine learning and AI.
AIJobs.net: This job board concentrates only on the artificial intelligence field. It’s a great resource for fans of this area.
Towards Data Science: This well-known online magazine has a lively community and also runs a job board with openings in the field.
5. Best Sites for Data Engineering Jobs
For people who like to build and support the systems that help data-driven companies, these platforms are great for data engineering jobs:
Data Engineering Bulletin: This site shares a list of data engineering jobs, usually at top tech companies.
Stack Overflow: Although it’s not just for data engineering, this community for developers has a strong job board that includes a section for data engineer positions.
AngelList: This platform connects startups with talented people. It’s a great place to find data engineering jobs in exciting settings.
6. Opportunities in Big Data: Where to Look
In the realm of big data, navigating the right job boards can lead to exciting opportunities. Here are some key platforms to consider:
Big Data Jobs: This niche platform focuses specifically on big data roles, providing a laser-focused approach to your search.
Dice: With a strong reputation in the tech industry, Dice offers a dedicated section for big data positions across various sectors.
Explore company websites of organizations known for their robust data teams, as they often advertise big data roles directly.
Job Board          Description
Big Data Jobs      Specifically for big data roles
Dice               Tech industry platform with a big data section
Company Websites   Target companies with large data teams
Maximizing Your Data Science Job Search
Finding the right data science job is more than just looking at job boards. You need a smart plan to stand out from other job seekers. By creating a strong resume and using networking to your advantage, you can boost your chances of getting that ideal job. Here are some useful tips to help improve your job search efforts.
Crafting a Data Science Resume That Stands Out
Your resume is your first chance to make a good impression, so make it matter. Focus on your skills and experiences that relate to the data science jobs you want.
Show Your Technical Skills: Write down your programming languages, tools, and tech skills clearly. Don’t just list them; show how you used them in real projects.
Use Numbers for Your Successes: Include specific numbers to show the results of your work. For example, if you improved model accuracy, say exactly by how much.
Customize for Each Job: Change your resume for every job you apply to. Highlight the experiences and skills mentioned in the job description. Think about using a resume review service to make your application better.
Networking Strategies for Data Professionals
Networking is still a great way to find jobs in the data science field. It helps to connect with other professionals and make new relationships.
Use LinkedIn: Update your profile with the right keywords and interact with leaders in the field. Contact recruiters and show your interest.
Join Industry Events: Events like conferences, meetups, and webinars are great chances to network. Mark your calendar for the MLconf event in August and meet other attendees.
Set up Informational Interviews: Ask data professionals for informational interviews to learn about their jobs and gain helpful insights. An email introducing yourself and what you want can open new paths.
Conclusion
When you look for jobs in data science, it’s important to know where to find the best opportunities. Use key platforms, special job boards, and networking tips made for data scientists to make your job search better. A strong data science resume can help you stand out. Joining in on useful networking activities can also increase your chances of getting the job you want in this tough field. Don’t forget, staying persistent and taking action are very important for a rewarding career in data science. Good luck with your job search!
0 notes
edtechnews · 2 years ago
Text
Using AI to help more college students graduate
4 lessons from Dean Dara Byrne on how DataKind's AI model helped improve graduation rates at John Jay College; the work will expand to 6 more CUNY schools with Google.org support
0 notes
mourning-again-in-america · 2 years ago
Text
my shitpost theory of webdev is that every sufficiently advanced client/server library wants to be written in a Lisp, demonstrated by the one good decision Paul Graham ever made
also, having seen a very small view of things, it seems like languages that aren't lispy from the start, like Ruby, often grow traits similar to macros, with kludges arising from their dissimilarity from true macros (which may have only started to exist in Racket? Not sure)
Python has decorators in flask and I think fastapi, not sure what the hell Django's doing, it looks like it's wearing Python as a skinsuit
Haskell has a piss-poor "actual macro" system, but most of the benefit of the advanced type system is the ability to replicate macro-like features: dynamic dispatch + sealed objects (DataKinds and closed type families), and complex ad hoc inheritance-like behaviors (the deriving mechanism, where a library author writes instances of a trait for a bunch of base objects (String, Int, etc.) and for combinations of those objects), which appear to be not unlike C++ templates
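the "sealed objects" bit is easy to sketch with plain GHC and no libraries; everything below (DoorState, Door, Opposite) is made up for illustration. DataKinds lifts a closed set of constructors to the type level, and a closed type family does total, sealed dispatch over them:

```haskell
{-# LANGUAGE DataKinds #-}
{-# LANGUAGE KindSignatures #-}
{-# LANGUAGE TypeFamilies #-}

-- A sealed set of states, lifted to the type level by DataKinds.
data DoorState = Open | Closed

-- A phantom-typed door: the state lives only in the type.
data Door (s :: DoorState) = Door deriving Show

-- A closed type family: total, sealed dispatch over DoorState.
type family Opposite (s :: DoorState) :: DoorState where
  Opposite 'Open   = 'Closed
  Opposite 'Closed = 'Open

-- toggling flips the type-level state; invalid transitions don't compile
toggle :: Door s -> Door (Opposite s)
toggle Door = Door

openOnly :: Door 'Open -> String
openOnly _ = "door is open"
```

trying `openOnly (Door :: Door 'Closed)` is a compile error, which is the whole point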
These additions are used heavily in Haskell's best webshit library family, where you specify your routes wholly in the types and get automatic generation of client functions whenever you add a new route to your server (the generated functions, when called, act as if they hit the HTTP route, handy for testing without touching the networking stack). And that's not even getting into property testing in Haskell, which imo is the true argument in favor of Haskell if you're not a compiler dev or otherwise oddly situated
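property testing is also easy to show in miniature without any packages (in real code you'd reach for the QuickCheck library's `quickCheck`, which generates and shrinks random inputs for you); the point is stating a law over all inputs rather than hand-picking examples. The generator below is a deliberately crude stand-in:

```haskell
import Data.List (sort)

-- A property: a law that should hold for *all* inputs, not one example.
prop_revRev :: [Int] -> Bool
prop_revRev xs = reverse (reverse xs) == xs

prop_sortIdemp :: [Int] -> Bool
prop_sortIdemp xs = sort (sort xs) == sort xs

-- Poor man's input generator: a deterministic pseudo-random step...
step :: Int -> Int
step x = (x * 31 + 17) `mod` 101

-- ...used to enumerate a pile of small input lists.
inputs :: [[Int]]
inputs = [ take n (iterate step seed) | n <- [0 .. 20], seed <- [-3, 0, 7] ]

checkAll :: Bool
checkAll = all prop_revRev inputs && all prop_sortIdemp inputs
```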
there have been, no doubt, countless "advances" which have served only to delay or destroy progression of the development of software systems. but i should be grateful, for the following reads mostly like nonsense to me
34 notes · View notes
warninggraphiccontent · 5 years ago
Text
20 November 2020
Some personal news
Funny how whenever anyone says they have 'some personal news', it's always professional news, isn't it?
It was my turn yesterday. After more than seven years at the Institute for Government, I've decided it's time for a new challenge. I'll be going freelance from 1 January, so please get in touch if you'd like to work with me (all interesting projects on data, digital, openness, government, etc considered). I'll also be working on a book idea. Exciting, if also a little bit vaguely terrifying...
I'll still be an associate at the IfG, and still running our Data Bites event series. (Put Wednesday 2 December in your diary for the next one, and then Wednesday 3 February 2021 after that.)
And - rest assured! - I'll still be writing this newsletter, though it will be taking a break for much of December.
Just one more thing this week: I have a report out (with colleagues Marcus and Oliver) on digital government (in the UK) during the coronavirus crisis. There's a nice write-up from diginomica here, some nice quotes for the paperback from Tanya Filer and Tom Loosemore, and I'll be on the IfG podcast talking about it later. I'm a big fan of this chart, which tells the story of the crisis through visitors to GOV.UK.
This is the second of three reports on digital government we're publishing this autumn - the first was on policy making in a digital world, and the third, on future technology and the government workforce, will be out soon.
Have a great weekend
Gavin
Enjoying Warning: Graphic Content?
Tell your friends - forward this email, and they can:
Subscribe via email
Follow on Twitter
Follow on Tumblr
Or:
Buy me a coffee (thank you!)
Follow me, Gavin Freeguard, on Twitter
Visit my website
Today's links:
Graphic content
Viral content
3D Map of COVID Cases by Population, March through Today (r/dataisbeautiful)
Covid in the U.S.: Latest Map and Case Count* (New York Times)
Here’s a quick thread on what we changed and why (Charlie Smart)
Visualizing Covid-19 Deaths As Spheres in a Tank (r/dataisbeautiful)
Why rich countries are so vulnerable to covid-19* (The Economist)
COVID wriggling its way up county by county, subsiding, and then returning in six dimensions (Benjamin Schmidt)
States That Imposed Few Restrictions Now Have the Worst Outbreaks* (New York Times)
Coronavirus: Inside test-and-trace - how the 'world beater' went wrong (BBC News)
How have COVID-19 rates changed over the last week in England's small areas? (Carl Baker)
New York and the crisis in mass transit systems* (FT)
The kids aren’t alright: How Generation Covid is losing out* (FT)
The vaccines come of age
An effective covid-19 vaccine is a turning point in the pandemic* (The Economist)
Eurozone economy: the struggle to stay afloat until a vaccine arrives* (FT)
The world will soon have covid-19 vaccines. Will people have the jabs?* (The Economist)
Tracking the vaccine race (Reuters)
A man for Four Seasons
Even in Defeat, Trump Found New Voters Across the U.S.* (New York Times)
How Trump’s presidency turned off some Republicans – a visual guide (The Guardian)
Counties That Suffered Higher Unemployment Rates Voted for Biden* (New York Times)
In 2020, the critical must-win states were the 'blue wall'. By 2024, they could be Arizona, Georgia or even Texas (Aron)
Suburban turnout pushed Joe Biden to victory* (FT)
How Suburbs Swung the 2020 Election (Bloomberg CityLab)
Here’s how long it could take to certify the vote in key states — and the GOP efforts to upend that process* (Washington Post)
#dataviz
#ElectionViz: US TV networks have room for data storytelling improvement (Tableau)
Tracking the US election results: 'We needed to be clear, fast, and accurate' (The Guardian)
Sport and leisure
Lewis Hamilton's seventh F1 world title: The stats (BBC Sport)
How new swing techniques are revolutionising golf* (The Economist)
“The Queen’s Gambit” is right: young chess stars always usurp the old* (The Economist)
Who’s in the Crossword? (The Pudding)
Everything else
The State of Ageing in 2020 (Centre for Ageing Better)
Skyscrapers in London: Do we want to reach for the stars?* (The Times)
Atlas of Sustainable Development Goals 2020 (World Bank)
Boeing’s Max jet set to return just as customers head for exit* (FT)
Behind the tally, names and lives* (Washington Post)
Climate graphic of the week: Siberia experiences record temperatures* (FT)
Meta data
Viral content
Digital government during the coronavirus crisis (IfG)
What has digital government in the UK learned during the COVID-19 crisis? (diginomica)
Investigation into government procurement during the COVID-19 pandemic (NAO)
How DWP managed a surge in demand for Universal Credit during COVID-19 (diginomica)
Vaccine rumours debunked: Microchips, 'altered DNA' and more  (BBC News)
Health
Crowdsourcing Our NHS AI Lab Skunkworks Project (NHSX)
Working on a global mental health databank pilot (Wellcome Digital)
National project shows digital inclusion is key to tackling health inequalities (Good Things Foundation)
UK government
The National Data Strategy for Health and Care (and the other one for everything else) (medConfidential)
The UK National Data Strategy 2020: doing data ethically (ODI)
Matt Warman's speech on digital identity at Identity Week 2020 (DCMS)
Integrated Review (the Prime Minister)
The latest release of @ONSgeography's National Statistics UPRN Lookup links #UPRNs to postcodes (via Owen Boswarva)
The Document Checking Service: trialling online passport validity checks (Government Digital Service)
A possible expansion of FOIA... (via George Greenwood)
Taiwan
How Taiwan beat Covid-19* (Wired)
Taiwan’s civic tech gift to the world (GovInsider)
How Taiwan became a coronavirus success story (IfG, from June 2020)
North America
Data disappeared: four years of the Trump administration (Highline)
How the U.S. Military Buys Location Data from Ordinary Apps (Motherboard)
The federal government’s chief information security officer is helping an outside effort to hunt for alleged voter fraud* (Washington Post, via Alice)
Zuckerberg and Dorsey to be quizzed by Senate following Biden vote victory (BBC News)
Companies could face hefty fines under new Canadian privacy law (CBC)
Analysis (Cory Doctorow)
Everything else
Announcing the Data Collective: Free training, consultancy, peer support, and community for those using data in the social sector (DataKind UK)
Increasingly trusted to find an edge: What it’s like to be a club’s data analyst* (The Athletic)
Design is the strategy* (Apolitical)
An adequacy determination does not resolve the lower standard of data protection in the UK. (Amberhawk)
The TBI Globalism Study: Transparency and Autonomy Should Underpin Online Voting Systems (Tony Blair Institute for Global Change)
The Data Governance Working Group of the Global Partnership for AI is seeking feedback on how we're thinking about our work and scope (via Jeni)
Opportunities
JOB: Director, GOV.UK (GDS)
JOB: Head of Technology and Architecture for GOV.UK opening at GDS (Technology in Government)
COURSE: ANNOUNCING: First of its Kind Executive Course on Data Stewardship — Focused on Data Re-Use in the Public Interest (Open Data Policy Lab)
EVENT: How to enhance the UK’s geospatial ecosystem (Geospatial Commission)
EVENT: UKGovCamp 2021
And finally...
The R number, crocheted. (Statistrikk)
The Civic Tech Graveyard
AI can now produce passable parody song lyrics. The system is called Weird AI Yankovic. Really. (Pando)
1 note · View note
digitfyi · 8 years ago
Text
Data Science for Democratic Freedoms
Ahead of Data Fest 2018 DataKind's Erin Akred Discusses Data Science for Democratic Freedoms
While there have been tremendous advances in technology and more data than ever now available, for-profit companies tend to be the only entities with the necessary budget and staff to take advantage. What if the same algorithms that companies use to boost profits or recommend products could instead be applied to some of the toughest challenges the world faces today?
Whether it’s predicting and…
View On WordPress
0 notes
todayispythonday-blog · 8 years ago
Text
070917 - Github
soooo today i am getting around to cleaning up my Github! in a terrible mess really. what prompted me to do so was because yesterday i went for a data gathering sort of stuff, gave my Github username to get access, and i was sooo embarrassed at how messy and unprofessional it was!
in any case, i have REMEDIED THAT TODAY hah. also getting my shit together so that i can get around to building up my portfolio.
it’s going to be a long journey, but baby steps! nobody ever accomplished anything great in one day, right? ...right?
---
also, since i have managed to scrape by all the pre-course work required for GA, i am busying myself with some SQL in the meantime; apparently it's pretty important and useful alongside Pandas in Python anyways.
maybe someday when i get my head around the basics of data cleaning, exploratory data analysis (EDA) properly, that’s when i can progress to some machine learning... i look forward to that day LOL.
in the meantime, if anyone has any handy tips on data cleaning and EDA please drop me a note and i would be eternally grateful.
---
in other news, the guys from DataKind are doing some pretty cool work here in Singapore - check them out!
https://www.meetup.com/DataKind-SG/
0 notes
watfordseo · 7 years ago
Link
via Twitter https://twitter.com/TheWatfordSEO
1 note · View note
todayinanalytics · 3 years ago
Text
Favorite tweets
“Having worked in campaigns, policy, and research on various issues, I recognise the crucial role data can play in creating positive social change” A big welcome to our new Team Coordinator, Amanda!🎉 Find out more about her https://t.co/Wd99UEXJMT #welcome #data4good #newhire
— DataKind UK (@DataKindUK) Jun 1, 2022
from http://twitter.com/DataKindUK via IFTTT
0 notes
seaofbitternessandsorrow · 4 years ago
Photo
#datadrivesimpact In collaboration with the nonprofit DataKind, John Jay College was able to use predictive analytics to identify students at risk of dropping out and intervene with resources to support them, leading to a remarkable improvement in graduation rates. Learn more at www.data.org.
0 notes
datakind · 4 years ago
Video
2021-04-07-ER-Discovery-Day.mov from DataKind on Vimeo.
0 notes
newslookout · 5 years ago
Text
Tech Equity and Serving the Community | Afua Bruce | TEDxFoggyBottom
Afua Bruce, a public interest technologist, introduces her experience in an up-and-coming industry that "allows us to not just imagine a more equitable world, but to actively build justice into the systems that drive our world."

Afua Bruce has spent her career developing and applying technology for the public interest. She is currently the Chief Program Officer for DataKind, a nonprofit that harnesses the power of data science in service of humanity. Afua was previously the Director of Engineering for New America's Public Interest Technology program. There, Afua oversaw the Public Interest Technology University Network, as well as technologists working with state and local government and NGOs to develop technology and policy better together. Prior to New America, Afua served in the Federal government as the Executive Director of the White House's National Science and Technology Council and in strategy positions in the FBI's science and technology programs. Afua began her career as a software engineer at IBM. Afua holds a degree in computer engineering from Purdue University and an MBA from the University of Michigan.

This talk was given at a TEDx event using the TED conference format but independently organized by a local community. Learn more at https://www.ted.com/tedx
The post Tech Equity and Serving the Community | Afua Bruce | TEDxFoggyBottom appeared first on News Lookout.
source https://newslookout.com/inspire/tech-equity-and-serving-the-community-afua-bruce-tedxfoggybottom/
0 notes
snak · 2 years ago
Text
Working with record fields in Haskell 2023
There are several ways to work with record fields in Haskell. The most traditional way is to use ordinary pattern matching and record field selectors.
{-# LANGUAGE RecordWildCards #-}

data Person = Person { name :: Name, age :: Int } deriving (Show)

data Name = Name { first :: String, last :: String } deriving (Show)

person :: Person
person = Person (Name "Tiger" "Scott") 10

f1, f2, f3, f4 :: String
Person {name = Name {first = f1}} = person
f2 = first $ name person
f3 = (\Person {name = Name {first}} -> first) person
f4 = (\Person {name = Name {..}, ..} -> first) person

a1, a2, a3, a4 :: Int
Person {age = a1} = person
a2 = age person
a3 = (\Person {age} -> age) person
a4 = (\Person {..} -> age) person

p1, p2 :: Person
p1 = person { name = (name person) { first = "Micheal" } }
p2 = person {age = 11}
As you can see, getting fields is simple, and simpler still with NamedFieldPuns and RecordWildCards.
But it suddenly gets rather complex when it comes to updating nested fields.
Note that you can stop generating field selectors with NoFieldSelectors. This extension is especially useful with DuplicateRecordFields, which allows you to have multiple fields with the same name in one module.
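A minimal sketch of that combination (this assumes GHC 9.2 or later; the Person/Company types are invented for illustration):

```haskell
{-# LANGUAGE DuplicateRecordFields #-}
{-# LANGUAGE NoFieldSelectors #-}
{-# LANGUAGE OverloadedRecordDot #-}

-- Both records may declare a `name` field in one module: NoFieldSelectors
-- suppresses the clashing top-level selector functions, and
-- OverloadedRecordDot (or plain pattern matching) still reads the fields.
data Person  = Person  { name :: String, age :: Int } deriving Show
data Company = Company { name :: String }             deriving Show

personName :: Person -> String
personName p = p.name

companyName :: Company -> String
companyName c = c.name
```

Pattern matching works too; the dot syntax is just the most direct replacement for the missing selectors.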
There is another way to get field values, which is HasField. Using HasField, you can access a field with a type-level string.
{-# LANGUAGE DataKinds #-}

import GHC.Records

data Person = Person { name :: Name, age :: Int } deriving (Show)

data Name = Name { first :: String, last :: String } deriving (Show)

person :: Person
person = Person (Name "Tiger" "Scott") 10

f :: String
f = getField @"first" $ getField @"name" person

a :: Int
a = getField @"age" person
Unfortunately, HasField doesn't support updating field values at this moment.
OverloadedRecordDot is another extension, which allows you to access a field just like in OO languages.
{-# LANGUAGE OverloadedRecordDot #-}

data Person = Person { name :: Name, age :: Int } deriving (Show)

data Name = Name { first :: String, last :: String } deriving (Show)

person :: Person
person = Person (Name "Tiger" "Scott") 10

f :: String
f = person.name.first

a :: Int
a = person.age

-- We need to implement setField by ourselves and enable OverloadedRecordUpdate to use these.
{-
p1, p2 :: Person
p1 = person {name.first = "Micheal"}
p2 = person {age = 11}
-}
As you can see, you can chain field names with . to access nested fields. Note that it uses HasField under the hood.
There is another extension named OverloadedRecordUpdate, but you need to implement the HasField type class by yourself to make it work.
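To make OverloadedRecordUpdate concrete, here is a sketch along the lines of the pattern in the GHC user's guide: with RebindableSyntax on, r.x and r {x = v} desugar to whatever getField and setField are in scope, so we supply our own. Everything below (the shape of the HasField class, the Person/Name types, last renamed to surname to dodge the Prelude clash) is illustrative rather than a fixed API:

```haskell
{-# LANGUAGE AllowAmbiguousTypes #-}
{-# LANGUAGE DataKinds #-}
{-# LANGUAGE FlexibleInstances #-}
{-# LANGUAGE FunctionalDependencies #-}
{-# LANGUAGE OverloadedRecordDot #-}
{-# LANGUAGE OverloadedRecordUpdate #-}
{-# LANGUAGE RebindableSyntax #-}
{-# LANGUAGE ScopedTypeVariables #-}
{-# LANGUAGE TypeApplications #-}

import Prelude

-- With RebindableSyntax, `r.x` desugars to `getField @"x" r` and
-- `r {x = v}` to `setField @"x" r v`, for whatever getField and
-- setField are in scope -- so we provide our own.
class HasField x r a | x r -> a where
  hasField :: r -> (a -> r, a)

getField :: forall x r a. HasField x r a => r -> a
getField = snd . hasField @x

setField :: forall x r a. HasField x r a => r -> a -> r
setField = fst . hasField @x

data Name = Name { first :: String, surname :: String } deriving Show
data Person = Person { name :: Name, age :: Int } deriving Show

-- Instances use plain construction (not record update syntax, which
-- would desugar back into setField and loop).
instance HasField "first" Name String where
  hasField (Name f s) = (\x -> Name x s, f)
instance HasField "surname" Name String where
  hasField (Name f s) = (\x -> Name f x, s)
instance HasField "name" Person Name where
  hasField (Person n a) = (\x -> Person x a, n)
instance HasField "age" Person Int where
  hasField (Person n a) = (\x -> Person n x, a)

person :: Person
person = Person (Name "Tiger" "Scott") 10

p1, p2 :: Person
p1 = person {name.first = "Micheal"}  -- nested update now works
p2 = person {age = 11}
```

With those instances in scope, the updates that had to stay commented out in the previous example type-check, including the nested person {name.first = ...}.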
The next option is lens (or another lens package). Although lens is far more powerful than plain field access, it works perfectly well for just getting and setting field values.
{-# LANGUAGE TemplateHaskell #-}

import Control.Lens

data Person = Person { _name :: Name, _age :: Int } deriving (Show)

data Name = Name { _first :: String, _last :: String } deriving (Show)

makeLenses ''Person
makeLenses ''Name

person :: Person
person = Person (Name "Tiger" "Scott") 10

f :: String
f = person ^. name . first

a :: Int
a = person ^. age

p1, p2 :: Person
p1 = person & name . first .~ "Micheal"
p2 = person & age .~ 11
While the lens package uses Template Haskell to generate lenses such as name, age, first, and so on, the generic-lens package uses GHC.Generics to derive them.
{-# LANGUAGE DataKinds #-}

import Control.Lens
import Data.Generics.Product
import GHC.Generics

data Person = Person { name :: Name, age :: Int } deriving (Show, Generic)

data Name = Name { first :: String, last :: String } deriving (Show, Generic)

person :: Person
person = Person (Name "Tiger" "Scott") 10

f :: String
f = person ^. field @"name" . field @"first"

a :: Int
a = person ^. field @"age"

p1, p2 :: Person
p1 = person & field @"name" . field @"first" .~ "Micheal"
p2 = person & field @"age" .~ 11
As you can see, it uses a type-level string passed to field to specify the target, instead of a generated lens.
0 notes
nycdoitt · 8 years ago
Text
NYC Open Data Week is back: March 3-10, 2018
Enjoy chatting with the NYC Open Data Team? Or...perhaps you’re merely Open-Data-Curious?
To raise awareness about NYC Open Data—a free data resource!—last year the NYC Open Data Team partnered with the civic technology community to produce Open Data Week 2017, which engaged over 900 New Yorkers! We’re now asking for submissions of ideas for Open Data Week 2018 and hope you’ll share an idea. The deadline for submissions is December 15th. Email [email protected] with any questions; you can also learn more via our coverage in StateScoop from a few weeks ago!
For inspiration, here are some great sessions and concepts from last year:
Data Jam / Hackathon
DataKind, 92Y and Bill & Melinda Gates Foundation: Giving Tuesday DataDive.
A two-day event engaging data scientists to brainstorm ways to increase the impact of #GivingTuesday.
Panel Discussion
General Assembly Panel Discussion: Data and…Health.
General Assembly brought together a panel of experts and influencers from the health and wellness spaces to discuss how big data is impacting their organizations.
Workshop
NYC Parks Computer Resource Centers Open Data for All: TreesCount! Workshop.
This free workshop, presented by NYC Parks and the NYC Open Data Team, offered a broad introduction to the NYC Open Data Portal along with the concepts of data literacy and analysis, using NYC TreesCount! data, the most accurate map of NYC’s street trees ever created.
NYC Big Apps: NYC Open Data Portal & Department of City Planning Facilities Explorer Tutorial
Free workshop for NYC Big Apps participants to learn about these tools as they work to build out solutions as a part of the annual Big Apps competition.
Dinner Salon
Prime Produce & Startup M/IG: Open Data Dinner Salon.
Prime Produce and Startup M/IG teamed up to host a dinner discussion engaging their communities in prototyping what a more aesthetically influenced public policy could look like for NYC.
Happy Hour
Reaktor Open Data Studio.
Reaktor hosted a happy hour to share ideas about how Open Data could be utilized in new ways.
Interactive Breakfast
Made in NY Media Center + Fabernovel Data & Media: Open Data Breakfast.
Made in NY Media Center teamed up with FaberNovel to host an interactive breakfast for developers, agency and civil service non-profits to explore using open data to build products and conduct research & analysis to create new applications.
Summit
Department of Small Business Services: 2017 Smart Districts Summit.
Inaugural NYC Smart Districts Summit, where community and technology leaders collaboratively explored how emerging technologies are being leveraged to address the most pressing district-level challenges. 
BetaNYC School of Data
Community conference showcasing NYC’s civic design, civic/government technology and open data ecosystem.
Live Demonstration
Civic Hall Presents: Open Data, Mapping Global Security & the Department of Defense
Civic Hall teamed up with the National Geospatial-Intelligence Agency (NGA) to present a demo of its geospatial data portals to address how we can get national security data into the open.
Civic Hall Presents: Demo of CALC Tool with 18F
18F is a digital innovation team within the US federal government; CALC is a tool the team developed to help contracting officers and contracting specialists make informed decisions through market research and price analysis for labor categories on federal government contracts.
College of Staten Island (CSI) Tech Incubator + Vizalytics: Data – A Driving Force of Innovation
Vizalytics and the NYC Open Data Team demo’d their platforms then took questions from the audience.
5 notes · View notes
digitalworldmania · 6 years ago
Text
5 Little-Known Facts You Should Know About Speech Analytics
The software-as-a-service (SaaS) model is disrupting traditional approaches to business analytics. When not at DataKind, you can find him teaching technology systems and business analytics at New York University. Studying data science is the best way to prepare oneself for the changing business landscape. It can include reporting and implementation methodologies around machine learning, artificial intelligence, and predictive and prescriptive analytics at scale.
Data Scientist course
The course covers plenty of subjects and tools that act as the body and soul of database management, including basic statistics, hypothesis testing, data mining and cleaning, machine learning, data forecasting, data visualization, programming languages such as MATLAB and C++, platforms like Hadoop, and Python plotting libraries such as Plotly, Matplotlib, and many others.
The MSc in business analytics has greatly enhanced my analytical, programming, and research skills, which enabled me to secure a position as a technology Business Analyst at Citi Group. There are many digital marketing tools available online to help you with analytics. It does an excellent job of performing descriptive analytics on limited amounts of data.
Courses in subjects such as Mobile Security and Analytics, Business Analytics Programming, Business Analytics Modeling, Business Process Management, Healthcare Analytics, and Cyber Security Analytics prepare you for careers in various industries and organizations. That might come in the form of offering advanced analytic methods to our active traders, providing a more personalized experience for our retail customers, or simply improving process efficiencies for the institutional side of the business, which is our advisors.
With so much on the table to discuss, I got to the crux of the matter straight away and asked Amit about the pillars he considered essential for an analytics-driven customer value management program. Enroll today in the ExcelR Solutions Business Analytics course. There are many companies whose product is data-scientific or algorithmic, but a large portion of data science done within an enterprise context is actually done in service of the business, rather than packaged as its own product.
The shortage of skilled professionals (data analysts, business analysts, data scientists) is another challenge for any organization, irrespective of the size and nature of the business. The course structure is built by technology experts who help instill professionalism in students; further down the line, the Business Analytics training program will help them achieve their goals and get placed at MNCs and large corporations.
Business Analytics is creating countless jobs across all domains around the globe. The course fee for Business Analyst certification online depends on the type of training institute, its location in Bangalore, the type of course, and the mode of training. Transformation of the traditional supply chain comes about through the integration of emerging technologies like blockchain, the Internet of Things, robotic process automation, big data, artificial intelligence, and predictive analytics.
We are located at :
         ExcelR - Data Science, Data Analytics
         Course Training in Bangalore
         49, 1st Cross, 27th Main,
         Behind Tata Motors, 1st Stage,
         BTM Layout, Bengaluru,
         Karnataka 560068
         Phone: 096321 56744
         Hours: Sunday - Saturday 7AM - 11PM
Click here to check our live location: Data Scientist course
Check out our Data Science Interview Questions
0 notes