# How To Define Validation Rules In Salesforce
Cloning Custom Objects in Salesforce Is a Pain — Here’s How migSO Makes It Easy
Let’s be honest — moving custom objects from one Salesforce org to another can be a real headache. Salesforce doesn’t give us a direct way to do it, and doing it manually? That can be a total time sink, not to mention error-prone.
That’s exactly why we built migSO — a native Salesforce app that makes cloning custom objects (and other metadata) super simple and stress-free.
First, What Exactly Is a Custom Object?
If you’re using Salesforce, you’ve probably worked with custom objects — these are basically user-defined data containers that store business-specific info Salesforce doesn't offer out of the box.
You can create custom fields, validation rules, record types, field sets — all tailored to your unique process. The only problem? Moving them between orgs (say from Sandbox to Production) isn't as smooth as it should be.
That’s Where migSO Comes In
migSO helps you clone, export, and manage metadata across Salesforce orgs — all within a friendly interface. No more tedious, step-by-step manual work. With a few clicks, you can clone multiple custom objects from one org to another, without worrying about missing pieces or breaking things.
How to Clone Custom Objects Using migSO (It’s Easier Than You Think)
Here’s how the process works once you’ve installed migSO:
Open migSO from the App Launcher
Click on Clone Wizard
Choose the type of metadata you want to move (like Custom Objects)
Select your Source Org
Pick the Objects you want to clone and hit Deploy
Check the Deployment Status to make sure everything went through
Set Field Permissions if needed
And finally, Download a report of what was deployed
Yup, that’s it. It really is that simple. No stress, no code, and no chasing down missing components.
Want to Track What You’ve Deployed?
migSO keeps track of everything you do through the Clone Wizard Logs.
You can:
View a detailed log of all your deployments
Click on any object you’ve deployed to get the full details
Download an Excel report from the Related tab — great for documentation or team handoffs
Why People Love Using migSO
If you’ve ever cloned metadata manually, you’ll understand why migSO feels like a game-changer. Here’s what makes it awesome:
✅ Native to Salesforce — no extra tools needed
✅ Easy mass cloning of custom objects
✅ Clean, simple interface
✅ Transparent logging and reporting
✅ Saves hours (if not days) of manual work
A Little About Us
We’re Tech9logy Creators, a certified Salesforce Consulting and ISV Partner with over a decade of experience. We’ve built apps for the AppExchange and helped hundreds of businesses get more out of their CRM.
Our goal? To make your Salesforce experience as smooth, productive, and pain-free as possible. If you’re tired of the hassle of cloning custom objects, give migSO a try — and let your team focus on what really matters.
codezix · 30 days ago
Migrating from HubSpot to Salesforce: What to Know
As businesses grow, so do their technology needs. While HubSpot is an excellent platform for small to medium-sized businesses, many organizations eventually outgrow its capabilities and look for a more scalable, customizable, and enterprise-level solution. That’s where Salesforce comes in.
Migrating from HubSpot to Salesforce is a significant step—one that can supercharge your sales, marketing, and service operations when executed properly. However, it also involves careful planning, data handling, system integration, and change management. If you're considering this transition, this guide will walk you through what to expect, how to prepare, and why working with a Salesforce consultant in Sydney can help ensure success.
Why Migrate from HubSpot to Salesforce?
Before diving into the how, let’s explore the why. HubSpot and Salesforce both offer powerful CRM capabilities, but they cater to different levels of business complexity.
Key reasons companies migrate:
Scalability: Salesforce is ideal for growing companies that need to manage complex workflows, large teams, and multiple departments.
Customization: Salesforce provides robust customization options through Apex (its proprietary coding language), custom objects, and Lightning components.
Advanced Reporting: Salesforce offers more powerful analytics and real-time reporting compared to HubSpot.
Enterprise Integrations: Salesforce integrates with a wider range of third-party and enterprise-grade systems.
Specialised Industry Support: Salesforce is built to support highly regulated industries like healthcare, finance, and government.
For companies in Australia’s growing tech and enterprise sectors, particularly those headquartered in or expanding within New South Wales, partnering with a Salesforce consulting partner in Sydney makes perfect sense when preparing for such a strategic move.
Step-by-Step Guide to Migrating from HubSpot to Salesforce
1. Define Your Goals
Start with clarity. What are you hoping to achieve by moving to Salesforce? Common goals include better data visibility, enhanced automation, deeper integrations, or support for more users and territories.
Working with Salesforce consultants in Sydney early in the planning phase can help you set measurable goals aligned with your business strategy.
2. Audit and Clean Your Data
Your CRM is only as good as the data in it. A data audit involves:
Identifying what data you currently use in HubSpot (contacts, companies, deals, tickets, custom fields).
Cleaning up duplicates and outdated records.
Mapping fields to Salesforce equivalents.
This is a crucial step where a Salesforce developer in Sydney can provide technical assistance in creating mapping documents, data transformation scripts, and validation rules.
3. Plan the Migration Strategy
There are multiple ways to migrate data from HubSpot to Salesforce, depending on the size and complexity of your CRM:
Manual Export/Import: Suitable for small businesses with basic CRM data.
Third-party Tools: Platforms like Data Loader, MuleSoft, and HubSpot-Salesforce integration tools offer semi-automated migration options.
Custom Scripts and APIs: For large-scale migrations with custom objects or workflows.
The choice of strategy should align with your business model and data architecture. This is where Salesforce consulting in Sydney becomes invaluable—they can recommend and execute the most effective method.
4. Rebuild Workflows and Automations
HubSpot workflows do not automatically transfer to Salesforce. You’ll need to recreate:
Lead nurturing sequences
Sales assignment rules
Email automations
Task triggers
Using Salesforce's Flow Builder, Process Builder, or Apex Triggers, a Salesforce developer in Sydney can rebuild these automations, often improving them with more sophisticated logic and scalability.
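To make that rebuild step concrete, here is a minimal, hypothetical sketch of the kind of routing logic a HubSpot assignment workflow might be replaced with. The Region__c picklist and the EMEA_Sales queue are assumptions, not part of any standard org, and in many cases a Flow or standard lead assignment rules would be the better fit.

```apex
// Hypothetical sketch only: routes EMEA leads to a queue on insert.
// Region__c and the EMEA_Sales queue are assumed to exist in the org.
trigger LeadRegionAssignment on Lead (before insert) {
    // Look the queue up once per transaction to stay within SOQL limits.
    Group emeaQueue = [
        SELECT Id
        FROM Group
        WHERE Type = 'Queue' AND DeveloperName = 'EMEA_Sales'
        LIMIT 1
    ];

    for (Lead newLead : Trigger.new) {
        // EMEA leads go to the shared queue; everything else keeps the default owner.
        if (newLead.Region__c == 'EMEA') {
            newLead.OwnerId = emeaQueue.Id;
        }
    }
}
```

A Flow with a decision element can usually express the same rule declaratively; Apex tends to be reserved for logic that Flow cannot model cleanly.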
5. Integrate Third-Party Apps
Salesforce integrates with thousands of tools via its AppExchange, but the process requires careful handling to avoid conflicts or data silos.
Apps you may need to reintegrate include:
Email marketing (e.g., Mailchimp, ActiveCampaign)
Customer support platforms (e.g., Zendesk, Intercom)
E-commerce platforms (e.g., Shopify, Magento)
Accounting software (e.g., Xero, QuickBooks)
Many Sydney businesses also work with local or region-specific systems. Partnering with a Salesforce consulting partner in Sydney ensures seamless integration with both global and local tools.
6. Train Your Team
Salesforce is a more robust and sometimes more complex system than HubSpot. Training your team ensures high adoption rates and fewer errors post-migration.
A Salesforce consultant in Sydney can deliver tailored training sessions based on user roles—sales reps, marketers, administrators—ensuring everyone is comfortable using the platform from day one.
7. Test Everything
Testing should include:
Data integrity: Are records correctly migrated?
Field mapping: Are fields showing as expected?
Workflow functionality: Do automations run correctly? (See the test sketch below.)
User permissions: Are access levels appropriately set?
Involving Salesforce consultants in Sydney during testing ensures that no detail is overlooked, reducing the risk of go-live hiccups.
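For the workflow-functionality check above, an Apex unit test can confirm that a rebuilt automation still behaves as expected once the data lands in Salesforce. The sketch below assumes the EMEA routing example shown earlier in this post; every name in it is a placeholder rather than a real org's configuration.

```apex
// Hypothetical regression test for the lead-routing sketch shown earlier.
@isTest
private class LeadRoutingPostMigrationTest {
    @isTest
    static void emeaLeadIsRoutedAwayFromDefaultOwner() {
        Lead testLead = new Lead(
            LastName  = 'Migration Check',
            Company   = 'Sample Co',
            Region__c = 'EMEA'
        );

        Test.startTest();
        insert testLead;
        Test.stopTest();

        Lead saved = [SELECT OwnerId FROM Lead WHERE Id = :testLead.Id];
        // If routing works, the owner is the queue rather than the running user.
        System.assertNotEquals(UserInfo.getUserId(), saved.OwnerId,
            'EMEA leads should be routed to the regional queue.');
    }
}
```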
8. Go Live and Monitor
Once you’ve tested and signed off, it’s time to go live. But your work doesn’t end there. For the first few weeks post-migration:
Monitor system performance.
Track user engagement and errors.
Provide ongoing support.
Having a local Salesforce consulting partner in Sydney on standby ensures quick resolution of any post-launch issues, minimizing disruption to business operations.
Common Challenges in Migrating from HubSpot to Salesforce
Even well-planned migrations come with challenges. Here are a few common ones—and how to solve them:
1. Inconsistent Data
Different field structures and naming conventions can cause import errors. Working with a Salesforce developer in Sydney to create clean data mapping solves this.
2. Feature Mismatches
Some HubSpot features don’t have direct Salesforce equivalents. For example, HubSpot’s contact lifecycle stages must be manually replicated with Salesforce fields and logic.
This is where custom development or using Salesforce’s flexible architecture comes in—areas where Salesforce consultants in Sydney shine.
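As a rough illustration of that replication work, the sketch below maps HubSpot's internal lifecycle stage values onto a custom Contact picklist during import. Both custom fields (HubSpot_Stage__c and Lifecycle_Stage__c) and the exact stage values are assumptions; a real implementation would match them to your own HubSpot portal and field design.

```apex
// Hypothetical mapping of HubSpot lifecycle stages to a Salesforce picklist.
// HubSpot_Stage__c holds the raw imported value; Lifecycle_Stage__c is the target field.
trigger ContactLifecycleMapping on Contact (before insert, before update) {
    Map<String, String> stageMap = new Map<String, String>{
        'subscriber'             => 'Subscriber',
        'lead'                   => 'Lead',
        'marketingqualifiedlead' => 'MQL',
        'salesqualifiedlead'     => 'SQL',
        'opportunity'            => 'Opportunity',
        'customer'               => 'Customer'
    };

    for (Contact c : Trigger.new) {
        String raw = (c.HubSpot_Stage__c == null) ? '' : c.HubSpot_Stage__c.toLowerCase();
        // Unknown or blank values fall back to 'Lead' rather than failing the import.
        c.Lifecycle_Stage__c = stageMap.containsKey(raw) ? stageMap.get(raw) : 'Lead';
    }
}
```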
3. Team Resistance
Users comfortable with HubSpot may resist change. This is addressed through proactive change management and role-based training.
Benefits of a Successful Migration
A properly executed migration from HubSpot to Salesforce delivers real, tangible benefits:
Improved Reporting: Real-time dashboards with deeper insights.
Stronger Integrations: Seamless syncing across your business apps.
Greater Flexibility: Custom objects, workflows, and layouts.
Enterprise-Level Control: Better security, permissions, and scalability.
Future-Proofing: Salesforce evolves continuously, supporting long-term growth.
These benefits are maximized when the migration is led by professionals, particularly a trusted Salesforce consulting partner in Sydney who understands your business landscape.
Final Thoughts: Why Work with a Salesforce Partner in Sydney?
Migrating from HubSpot to Salesforce is more than just a data transfer—it’s a digital transformation. To get it right, you need both strategic insight and technical expertise.
Here’s why choosing a local partner matters:
Proximity: In-person workshops, training, and support.
Industry Insight: Local consultants understand Australian business regulations and industry nuances.
Speed & Responsiveness: Being in the same time zone ensures fast response and collaboration.
From SMBs to enterprise companies, businesses are increasingly turning to Salesforce consultants in Sydney to handle complex migrations and unlock Salesforce’s full potential.
Ready to make the move from HubSpot to Salesforce? Let a qualified Salesforce consultant in Sydney help you plan, execute, and optimize your migration journey for success.
manrastechnology · 6 months ago
A Comprehensive Guide to Field Dependencies in Salesforce
Salesforce provides essential tools to optimize your processes and ensure seamless workflows. By leveraging these features, you can maintain accuracy across your data. In this article, we’ll discuss how to set up Salesforce field dependencies, implement Salesforce validation rules, and use Salesforce formula fields to enhance Salesforce data quality and ensure Salesforce data integrity across your organization.
Understanding Field Dependency
Field dependency in Salesforce is a feature that allows you to filter and display relevant options in one picklist based on the user’s selection in another picklist. It involves linking a controlling field and a dependent field, where the controlling field determines the values shown in the dependent field. This ensures Salesforce data integrity by preventing users from selecting irrelevant options, helping to maintain accurate and consistent data.
Benefits of Field Dependencies
Now that you know what Salesforce field dependencies are, here are some benefits that make them a valuable tool in Salesforce:
Cleaner and Consistent Data
Field dependencies ensure that users only see relevant options, making data entry more accurate and organized and simplifying reporting and analysis.
Streamlined Workflows
By hiding irrelevant choices, field dependencies reduce confusion, making complex workflows more straightforward to manage for users.
Better User Experience
With guided options, users can input data faster and with fewer mistakes, leading to a smoother and more efficient experience.
Versatility Across Objects
Field dependencies work with various objects in Salesforce, making them adaptable to different business needs without requiring extra validation rules.
Popular Use Cases
Apart from the benefits, let’s explore some popular use cases where field dependencies simplify workflows and enhance accuracy:
Region-Specific Choices
Field dependencies filter options based on geographical data for businesses operating across multiple locations. For example, when a user selects a country like “United States,” the dependent field will show only relevant states, ensuring accurate data entry. The State and Country Picklists feature can also support this by further standardizing the values.
Product Selection
For companies with diverse product lines, field dependencies streamline the selection process. When a user selects a product category, such as “Electronics,” the dependent field will display only relevant products like “Smartphones” or “Laptops,” ensuring a smoother, more accurate choice.
Industry-Specific Fields
Field dependencies are also useful in industries with specialized requirements, like healthcare or manufacturing. For example, selecting “Surgery” in a hospital’s system could trigger a list of relevant equipment, ensuring users only see valid options.
How to Set Up Field Dependencies
Here are the key steps to set up field dependencies in Salesforce:
Ensure You Have the Right Permissions
Before setting up field dependencies, confirm that you have the “Customize Application” user permission. This permission allows you to define and edit dependent picklists, which is crucial for customizing field interactions in Salesforce.
Check the Fields Involved
Review the fields you want to use in your field dependency. The controlling field can be a standard or custom picklist, checkbox, or similar field type. The dependent field, however, must be a custom picklist. Ensure that both fields have all the necessary values to create the dependency.
Navigate to Field Dependencies
In Salesforce, go to Setup → Object Manager, and select the object (such as Case or Contact) where the field dependency will be created. From there, click on Fields & Relationships, and then select Field Dependencies to start the setup process.
Create a New Field Dependency
Click on the New button to create a new field dependency. Then, choose the controlling field and the dependent field from the dropdown lists and click Continue to move forward with configuring the dependency.
Set Up the Dependency Matrix
The dependency setup involves a matrix where you link values from the controlling field to the corresponding options in the dependent field. For each choice in the controlling field, select which values in the dependent field should be displayed. You can add or remove values by clicking on them and using the “Include” or “Exclude” buttons.
Preview the Dependency
After configuring your dependency, click Preview to see how the fields will behave. This step allows you to test and ensure that the dependent field displays only the appropriate options based on the controlling field’s selection.
Save Your Changes
Once you’re satisfied with how the field dependencies work, click Save to finalize the setup. This will lock in the changes and apply the dependency across your Salesforce system.
Read More: https://www.manras.com/a-comprehensive-guide-to-field-dependencies-in-salesforce/
techforce-services · 11 months ago
Innovative DevOps Approaches to Infrastructure as Code
In the evolving landscape of software development, Infrastructure as Code (IaC) has emerged as a transformative approach to managing and provisioning IT infrastructure. By treating infrastructure configurations as code, organizations can achieve consistency, scalability, and efficiency. This article explores innovative DevOps approaches to IaC, highlighting how Salesforce DevOps, including Salesforce DevOps tools and the Salesforce DevOps Center, can enhance IaC practices within the Salesforce ecosystem.
Understanding Infrastructure as Code
Infrastructure as Code is a DevOps practice that involves managing and provisioning computing resources through machine-readable definition files rather than physical hardware configuration or interactive configuration tools. IaC allows infrastructure to be versioned and treated as software code, enabling automation, consistency, and repeatability in infrastructure management.
Benefits of IaC in DevOps
Consistency and Reliability: IaC ensures that the same configuration is applied consistently across multiple environments, reducing the risk of configuration drift and manual errors.
Speed and Efficiency: Automated infrastructure provisioning speeds up the deployment process, allowing teams to spin up environments quickly and efficiently.
Scalability: IaC makes it easier to scale infrastructure up or down based on demand, ensuring optimal resource utilization.
Version Control: Treating infrastructure as code allows for versioning and rollback capabilities, making it easier to track changes and revert to previous configurations if needed.
Innovative DevOps Approaches to IaC
1. Modularization and Reusability
Modularization involves breaking down infrastructure code into reusable components or modules. This approach promotes code reuse and simplifies infrastructure management. For instance, creating modules for common infrastructure components such as virtual networks, storage accounts, and compute instances allows teams to reuse these modules across multiple projects. Tools like Terraform and AWS CloudFormation support modularization, enabling teams to build and maintain scalable and reusable infrastructure templates.
2. Policy as Code
Policy as Code is an innovative approach that integrates compliance and security policies into the IaC process. By defining policies as code, organizations can automate compliance checks and ensure that infrastructure adheres to security standards. Tools like HashiCorp Sentinel and Open Policy Agent (OPA) allow teams to enforce policies programmatically, ensuring that infrastructure deployments meet regulatory and security requirements.
3. GitOps
GitOps is a DevOps methodology that uses Git as a single source of truth for declarative infrastructure and applications. By storing IaC definitions in a Git repository, teams can leverage Git workflows for version control, collaboration, and automated deployments. When changes are committed to the repository, automation tools like ArgoCD or Flux can synchronize the desired state with the actual state of the infrastructure, ensuring consistency and reliability.
4. Automated Testing and Validation
Incorporating automated testing and validation into the IaC pipeline ensures that infrastructure code is thoroughly tested before deployment. Tools like Terraform Validate, AWS Config Rules, and Chef InSpec allow teams to write tests for infrastructure configurations, validating that they meet predefined criteria and standards. Automated testing reduces the risk of errors and ensures that infrastructure deployments are robust and reliable.
5. Immutable Infrastructure
Immutable infrastructure is an approach where infrastructure components are replaced rather than updated. This ensures that every deployment results in a fresh, clean environment, eliminating configuration drift and reducing the risk of inconsistencies. Immutable infrastructure can be achieved using containerization tools like Docker and orchestration platforms like Kubernetes, which facilitate the deployment of immutable containerized applications.
6. Integration with Salesforce DevOps
For organizations operating within the Salesforce ecosystem, integrating IaC practices with Salesforce DevOps can significantly enhance infrastructure management. Salesforce DevOps tools, including the Salesforce DevOps Center, provide capabilities for version control, automated testing, and continuous integration. By leveraging these tools, teams can manage Salesforce infrastructure as code, ensuring consistency, scalability, and efficiency.
7. Continuous Monitoring and Feedback
Implementing continuous monitoring and feedback mechanisms is crucial for maintaining the health and performance of infrastructure. Tools like Prometheus, Grafana, and Datadog offer real-time monitoring and alerting capabilities, enabling teams to detect and address issues proactively. Integrating these monitoring tools with IaC workflows ensures that infrastructure changes are continuously monitored, and any anomalies are promptly identified and resolved.
Conclusion
Innovative DevOps approaches to Infrastructure as Code are transforming the way organizations manage and provision their IT infrastructure. By adopting practices such as modularization, policy as code, GitOps, automated testing, immutable infrastructure, and continuous monitoring, teams can achieve consistency, scalability, and efficiency in their infrastructure management. Leveraging Salesforce DevOps tools and the Salesforce DevOps Center within the Salesforce ecosystem further enhances these capabilities, enabling seamless integration and management of Salesforce infrastructure as code. Embrace these innovative approaches to IaC to drive efficiency, reliability, and agility in your organization's infrastructure management.
kirankumar166 · 1 year ago
Dell Boomi Salesforce Integration Guide
Dell Boomi Salesforce Integration Guide: A Seamless Connection
Dell Boomi is a powerful iPaaS (Integration Platform as a Service) solution renowned for connecting applications, data sources, and systems within the cloud and on-premises. As a leading CRM platform, Salesforce is a natural target for Boomi integrations. In this guide, we’ll walk you through integrating Dell Boomi with Salesforce to streamline your business processes.
Why Integrate Dell Boomi and Salesforce?
360-Degree Customer View: Integrate Salesforce with your ERP, accounting systems, and other sources to gain a complete picture of your customers.
Process Automation: Automate tasks such as lead creation in Salesforce from other systems or update orders in your ERP based on Salesforce opportunities.
Data Synchronization: Ensure data consistency across your systems, eliminating manual data entry and reducing errors.
Real-Time Updates: Enable real-time or near real-time data flow, enhancing business agility and responsiveness to customer needs.
Step-by-Step Guide
Set up Your Dell Boomi Account:
If you don’t have one already, create a Dell Boomi account.
Familiarize yourself with the Boomi interface and its core components (processes, connectors, maps).
Salesforce Connector Configuration:
Obtain your Salesforce security token and other credentials (username, password).
In Boomi, add the Salesforce connector to your process.
Configure the connector with your Salesforce credentials and select the API version.
Select Salesforce Operations:
Determine which connector operations you need (such as query, create, update, or upsert) and what data to exchange between Salesforce and other systems.
Data Mapping:
Define how data fields from your source system map to the corresponding fields in Salesforce or vice versa.
Boomi’s drag-and-drop interface simplifies this process.
Consider data transformations or cleansing if necessary.
Error Handling and Validation:
Include logic for handling potential errors (e.g., invalid data, Salesforce API limits).
Implement data validation rules to ensure data integrity.
Testing:
Thoroughly test your integration in a test environment before deploying to production.
Debug and troubleshoot any issues that arise.
Deployment:
Deploy your integration process to a Boomi Atom (runtime environment).
Consider deploying to a test atom first, then to your production atom.
Monitoring:
Use the Boomi dashboard to monitor your integration and track data flow.
Set up alerts to notify you of any issues.
Best Practices
Start Simple: Begin with a small integration project to gain experience before tackling more complex ones.
Utilize Reference Fields: Use Salesforce reference fields in Boomi maps to avoid querying for Salesforce IDs, enhancing performance.
Consider Data Volumes: Process data in batches to optimize performance for large data sets.
Leverage the Boomi Community: Take advantage of the extensive knowledge base for support and best practices.
Conclusion
By following this Dell Boomi Salesforce integration guide, you can unlock the power of seamless connectivity between your crucial business systems. Boomi’s intuitive interface and robust capabilities make it a top choice for businesses seeking to streamline operations and enhance their data-driven decision-making.
You can find more information about Dell Boomi in this  Dell Boomi Link
 
Conclusion:
Unogeeks is the No.1 IT Training Institute for Dell Boomi Training. Anyone Disagree? Please drop in a comment
You can check out our other latest blogs on  Dell Boomi here – Dell Boomi Blogs
You can check out our Best In Class Dell Boomi Details here – Dell Boomi Training
Follow & Connect with us:
———————————-
For Training inquiries:
Call/Whatsapp: +91 73960 33555
Mail us at: [email protected]
Our Website ➜ https://unogeeks.com
Follow us:
Instagram: https://www.instagram.com/unogeeks
Facebook: https://www.facebook.com/UnogeeksSoftwareTrainingInstitute
Twitter: https://twitter.com/unogeek
quantoknack2 · 3 years ago
How To Define Validation Rules In Salesforce
With so many changes made to Salesforce records on a regular basis by employees, it may be challenging to keep track of the quality of the input data. Salesforce uses Validation Rules to protect users from making mistakes (such as entering phone numbers in the wrong format), establish verification boundaries, and standardize data requirements in general.
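As a concrete illustration, the phone-format case mentioned above can be handled with a validation rule whose error condition formula rejects any value that does not contain exactly ten digits. This is a sketch based on Salesforce's sample validation rules; the field name and the regular expression should be adapted to your own format requirements.

```
/* Error condition: block the save when Phone is not a ten-digit number. */
NOT(
  REGEX(Phone, "\\D*?(\\d\\D*?){10}")
)
```

Pair the formula with a clear error message (for example, "Phone must contain 10 digits") so users know how to correct the record before saving.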
champprivacy · 5 years ago
GDPR Compliant Cookie Solution - What it means?
Cookie Law as passed in 2009 got a new enforcement life after GDPR. The Court of Justice of the European Union, in the Planet49 case, ruled that storing cookies requires active consent (the GDPR standard). Following the judgment, the Data Protection Authorities of Ireland, Germany, Spain, and others have started enforcement actions against websites that do not have a GDPR compliant cookie consent banner on their website.
5 steps to create a GDPR compliant cookie consent solution for your website
Give users a notice using a banner or pop-up with clear and comprehensive information on the use & purposes of cookies. Ensure you add a link to your cookie policy or privacy policy in the notice
Set the cookies only when the user has given consent for cookies
Give users an option to Accept and Reject cookies
Create a second layer where users can give consent to each purpose of cookies separately
Create a permanent link or button for users to withdraw cookie consent. This should be placed on your home page or privacy policy page.
On implementing the above steps, your website should have the following flow for cookie consent:
GDPR Compliant Cookie Consent Banner
Table of Contents
What’s cookie consent
Cookie Consent Banner Design
GDPR Cookie Consent Examples
Cookie Consent Script
Cookie Consent for Google Tag Manager
Cookie Consent for Website Builders
Conclusion
What’s cookie consent
Cookie Consent is the process by which websites take users’ consent to set cookies. It started with Europe’s ePrivacy directive, or the cookie law. Under ePrivacy, it was mandatory for websites to take consent from users before storing or accessing cookies on users’ devices. However, many websites treated consent as simply showing a cookie banner with an OK button, or implied consent from continued use of the website. GDPR clearly defines what constitutes legal consent, so cookie consent now has to include an option to deny alongside accept, plus a method to give consent based on purposes or uses of cookies, like Targeting and Analytics, amongst others.
When is cookie consent needed?
Cookie consent is needed:
If you offer your product or service to EU customers, including a free product or service. For example, media websites like Techcrunch are free services for EU customers
If you are targeting EU customers, this is indicated if you have an EU domain like .eu, .de or you offer local currencies, local language on your website or you are advertising to EU users like an American university advertising for its courses in EU
What happens if you don’t comply with cookie consent?
Users can file a complaint against your company with the Data Protection Authority of your country, and this could lead to fines under GDPR. Recently, the Data Protection Authorities of Ireland and Germany have started a sweep of websites to check whether they comply with cookie consent and will be sending notices soon. Here are some actions taken for not complying with cookie consent:
Planet 49: CJEU Rules on Cookie Consent
Oracle & Salesforce hit with class action GDPR lawsuit
IKEA was fined 10,000 Euros for cookie consent violations
Vueling Airlines was fined 30,000 Euros for not allowing users to give granular consent
Cookie Consent Banner Design:
A cookie consent banner has the following requirements to make it legal:
Cookie Consent Text: This is also termed as Cookie Consent Notice and is used to inform in a simple, clear language that you use cookies on your website.
Cookie Policy: This is a detailed version of your cookie consent notice explaining why you use cookies, a list of cookies with purposes and a method for users to withdraw consent for cookies
Accept & Deny Buttons: These are options for your users to either accept or deny the cookies. You should ensure that you don’t use dark design patterns to give more weight to Accept over Deny. For your cookie consent to be valid in the EU, you have to ensure that accepting and denying cookies are equally easy for users.
Cookie Preferences: This should open up a preference center where users can give granular consent for each purpose. ePrivacy allows websites to set Strictly Essential Cookies without consent so that can always be on, for other purposes like Targeting, Analytics you should allow users to switch them on or off.
Is your website cookie compliant?
Scan your website for cookies and generate a compliance report
GDPR Cookie Consent Examples
Cookie Consent is the first interaction that users will have on your website, so you should ensure that it is styled to match your website. Smashing Magazine lists some ways to create user-friendly cookie consent banners. Some examples we liked and you can take inspiration from:
Techcrunch does a good job with the cookie consent notice. The notice explains to users that cookies are in use, describes what they are used for, and links to both the privacy and cookie policies. However, they don’t have an option to Reject cookies, and thus it is not a great experience for users who want to reject the use of cookies.
GDPR Cookie Consent Example: Techcrunch
AirBnB gives a preference center for users to give consent on each purpose separately. Here they have clearly defined Performance as one of the purposes, explaining the use of the cookies. They also allow users to switch off cookies for this purpose completely or switch off each cookie individually. In our opinion, consent for each individual cookie is overkill for users, and you should be fine just giving a purpose-level option to users.
GDPR Cookie Consent Example: AirBnB
Asos’s cookie banner is well styled but does not give users an option to reject or change cookie settings. This is something we would not recommend.
GDPR Cookie Consent Example: Asos
Webflow shows a nice banner at the bottom but does not give the user an option to accept or reject cookies there. You can change your preferences, but it would have been much better if the buttons were on the banner itself.
GDPR Cookie Consent Example: Webflow
Cookie Consent Script
Cookie consent script blocks and unblocks cookies based on the user’s consent. It ensures websites comply with the ePrivacy Directive and GDPR. There are two ways in which the script can function:
Manually block cookies:
Change each cookie-setting script tag from type="text/javascript" to a non-executable type such as type="text/plain", and add a class that identifies its cookie category (for example, class="analytics")
The class value will allow you to handle granular cookie consent
Once the user gives consent, switch the approved scripts back to type="text/javascript" and execute them
One of the problems with the manual method is it takes a lot of time and you can miss some scripts which will make your website non-compliant.
Automatically block cookies:
Scans the website for cookies and allows you to categorize them into different categories
Automatically blocks the scripts and unblocks them once the user gives cookie consent
Sign-up to Privado and automatically block cookies with our cookie consent script.
Cookie Consent for Google Tag Manager
Your cookie consent solution should also ensure cookies set via tags from Google Tag Manager are blocked until the user gives consent. Here is how you can use our cookie consent solution to do that:
Download the container from our dashboard
Import the container into your GTM account; it will add the triggers that block and allow tags. Some examples are Allow Analytics and Block Analytics
You can either use the Allow triggers to fire these tags, or use the Block triggers and add them as exceptions for your tags
Go to Preview mode and confirm that your tags only fire once the user gives consent
Cookie Consent for Website Builders
We offer integration with the following third-party website builder tools to seamlessly display a cookie consent banner:
Conclusion
Cookie Consent seems simple on the outside, but it requires you to set up the right cookie banner, block cookies, tags, and pixels with the help of a script, and allow users to change cookie consent from your home page or cookie policy. This guide will help you set up all these elements for your website.
You can also use this free cookie consent tool and make your website compliant with privacy laws across the world including GDPR.
Originally published at https://www.privado.ai on October 6, 2020.
ablyproconsultant · 2 years ago
A Step-by-Step Guide for Salesforce Service Cloud Implementation
Service Cloud Implementation
In 2023, more than 11574 businesses started using the Salesforce Service Cloud as a knowledge-management solution. This powerful tool can help streamline your customer service operations and improve customer satisfaction. But where do you start? In this step-by-step guide for Salesforce Service Cloud Implementation, we'll walk you through the process of implementing Service Cloud for your business.
We'll cover everything from setting up your account to customizing your dashboard and managing cases. Whether you're a seasoned Salesforce user or just getting started, this guide is here to help make the process easy and enjoyable.
So, grab a cup of coffee and let's dive in!
Salesforce Service Cloud Implementation Checklist
If you are planning a Salesforce Service Cloud implementation in your organization, it is important to have a well-defined plan and checklist to ensure a smooth and successful rollout.
Here are some key items to include on your Salesforce implementation checklist:
1. Define Your Business Objectives and Goals for Service Cloud Implementation
Before implementing Salesforce, it's important to understand what your business needs and how Salesforce can meet those needs.
Determine what you want to achieve with the new system and how it will benefit your organization.
Clearly define what you want to achieve with the Salesforce implementation and ensure that all stakeholders are on the same page.
2. Choose the Right Salesforce Edition and Licenses for your Organization
Salesforce offers different editions with various features and pricing, so choose the one that fits your business needs and budget.
Consider your budget, number of users, and required features when selecting the right edition.
3. Customize Your Salesforce Instance
Customize Salesforce to fit your business processes & workflows and configure the platform to meet your business needs by creating custom fields and objects.
This includes configuring fields, page layouts, and validation rules.
4. Assemble a Team
In this phase, you can assign a project team and identify a project manager to oversee the implementation process or find the best Salesforce Implementation partner.
This team should include representatives from different departments in your organization.
5. Identify Your Data
Determine what data you need to move over to Salesforce and create a plan for migrating it.
6. Import Data Into Salesforce
To migrate your existing data into Salesforce, you need to develop a data migration strategy that ensures all relevant data is transferred accurately from your current system. Considerations for data migration include:
Matching metadata must be developed in the new target org before data can be imported from one org to another.
Experienced administrators or developers can migrate metadata by Deploying and Retrieving Metadata with a client tool like the Ant Migration Tool.
Other target instance adjustments include record types, page layouts, etc.
Create sharing models to support new user groups as necessary: Sharing Rules, Roles, and Profiles.
Review the ownership rules.
Ensure that all necessary data points and fields from the source organization are captured in the target.
Before importing data, run a migration test. It's best to test migration in a sandbox first.
Once the migration is complete, perform data validation (a quick spot-check sketch follows below).
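One lightweight way to do that validation is an anonymous Apex spot check that compares the loaded record count against the count in your source extract. Everything below is illustrative: the object, the filter, and the expected figure are placeholders, not values from any real migration.

```apex
// Hypothetical post-migration spot check, run as anonymous Apex.
Integer expectedAccounts = 12500; // placeholder taken from the source extract
Integer loadedAccounts = [SELECT COUNT() FROM Account WHERE CreatedDate = TODAY];

System.debug('Expected: ' + expectedAccounts + ', loaded: ' + loadedAccounts);
if (loadedAccounts != expectedAccounts) {
    System.debug(LoggingLevel.ERROR,
        'Account counts do not match; investigate before go-live.');
}
```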
7. Define Your Security Model
Define user roles and permissions to ensure data security and prevent unauthorized access.
Set up permissions and access controls to ensure that data is secure and only accessible by authorized individuals.
8. Train Users
Develop a training plan for users to ensure they are comfortable using the new system and can maximize its potential.
Provide training for all Salesforce users, from administrators to end-users, to ensure proper use of the platform.
9. Integrate With Other Systems
Integrate Salesforce with other systems, such as marketing automation platforms or customer service software, to create a seamless experience for your customers.
Here are some of the recommended system integrations for Salesforce Service Cloud:
Jitterbit Salesforce Service Cloud Integration.
Salesforce Service Cloud Custom Integration.
MuleSoft Integration with Salesforce Service Cloud.
Salesforce Service Cloud Integration with webMethods.io.
10. Test and Validate
Test the system extensively before going live to ensure that everything is working properly.
Test all customizations, workflows, and integrations to ensure they work correctly.
11. Go live
Launch the system and monitor its performance to identify any issues that need to be addressed.
12. Ongoing Support
In this phase, you can plan for ongoing maintenance and support of the system after implementation. Regularly maintain and update your Salesforce instance to keep it running smoothly and efficiently.
By following these steps and having a comprehensive checklist in place, you can ensure a successful Salesforce Service Cloud implementation for your organization.
Conclusion
Salesforce Service Cloud can be a game-changer when it comes to customer service and support. With its powerful tools and features, Service Cloud enables businesses to provide exceptional customer service experiences that keep customers coming back for more. We hope this guide to Salesforce Service Cloud implementation gives you a thorough understanding of how to use Service Cloud to revolutionize your customer service operations. Take the first step and start delivering value to your organization.
Source: Salesforce Implementation Guide
Salesforce Metadata And migSO - The Forever Indestructible bond
Salesforce Metadata is the invisible glue that holds everything together in your Salesforce org.
However, managing Salesforce metadata operations becomes daunting, especially when dealing with multiple Salesforce orgs. This is where migSO comes into the picture. This powerful Salesforce app simplifies and streamlines metadata operations and lets you efficiently manage them. Let’s examine the indestructible bond between metadata and migSO and discover how it can transform your Salesforce experience.
What is Salesforce Metadata?
Metadata in Salesforce refers to data that describes data. It incorporates all the elements that help you configure and customize Salesforce instances. Salesforce Metadata describes how your objects behave and specifies the “look and feel” of your Salesforce org. Here’s what Salesforce Metadata includes:
Fields
Objects
Validation Rules
Layouts
Field Sets
Static Resources
Labels
Global Picklist Value Sets
Permission Sets
Role of migSO in Managing Salesforce Metadata
As your business grows, managing Salesforce metadata operations becomes much more complex. Traditional methods are incapable of handling such large-scale operations, leading Salesforce developers and admins to seek more advanced solutions.
migSO, an abbreviation for “Migrate Salesforce Org,” is a native Salesforce application that enhances how you handle metadata operations. migSO helps you break stereotypes and effectively manage metadata at your fingertips. It empowers you to clone, export, and manage Salesforce metadata seamlessly.
The Three Main Pillars of migSO
Here are some top-notch features of our latest Salesforce app, migSO-
Seamless Cloning- migSO enables you to clone metadata items, such as fields, objects, layouts, and more, from one Salesforce org to another with just a few clicks. It minimizes the chances of errors that can occur during manual cloning.
Effortless Exporting- migSO lets you export important metadata items with a few clicks. It empowers you to export profiles, permission sets, validation rules, and other metadata items ensuring you have time to focus on key business areas.
Hassle-free Management- migSO also fetches next-level tools to ensure hassle-free metadata management. With detailed insights and optimization suggestions, improving the performance of your Salesforce org becomes easier.
The Indestructible Bond between Salesforce Metadata & migSO
Are you ready to revolutionize your Salesforce metadata operations? The synergy between Salesforce metadata and migSO makes it all possible. Here’s how they both work together, creating an indestructible bond-
Efficiency: migSO lets you handle complex metadata operations effortlessly, enabling your team to manage Salesforce orgs more efficiently.
Accuracy: migSO reduces the chances of errors by automating the metadata migration process while maintaining metadata integrity and system performance.
Scalability: As your business evolves, so do your metadata operation needs. migSO encourages scalability by providing all the necessary tools to manage higher volumes of metadata seamlessly.
Conclusion
migSO does share an indestructible bond with metadata as it empowers admins and developers to seamlessly migrate, export, and manage different metadata items within a few clicks. As your business evolves, having the right tools to manage your metadata is essential for success. migSO provides the solutions you need to navigate the complexities of Salesforce metadata operations, allowing you to focus on driving your business forward.
Who Are We?
Tech9logy Creators is a well-established Salesforce App Development Company with over 10 years of experience. Our certified professionals have deep expertise in the Salesforce AppExchange and develop diverse apps while adhering to its guidelines. Opt for our Salesforce App Development Services to get a customized app for your enterprise today.
Contact us for more information.
wirecrm · 3 years ago
Salesforce Revenue Cloud – The Ideal Tool to Optimize Business Growth
For years, companies across the world relied on strong networks of sales reps to connect with their clients. However, the outbreak of the COVID-19 pandemic led to a vast majority of customers reaching out to businesses through various digital channels. Many firms believe this trend will continue for a long time, even well after the pandemic subsides.
Against this backdrop, how can companies manage and increase their sales revenues? What does it take to ensure seamless management of revenue data across multiple channels? How can firms come up with novel business models to meet the dynamic needs of their customers in the post-pandemic world? They need to use a robust application to optimize their revenue cycles; one of the most popular tools used by organizations to augment their revenue management efficiencies is Salesforce Revenue Cloud.
Today, we’ll look at the capabilities of Salesforce Revenue Cloud and how it helps meet your growth goals in a rapidly-changing business environment.
Innovative Features of Salesforce Revenue Cloud that Empower Your Success
Excellent Support to Subscription-based Business Models
Salesforce Revenue Cloud is developed to deliver top-notch support to a wide variety of subscription-based revenue models; you can utilize the revenue optimization solution to use billing models such as one-time purchase, perpetual licenses or milestone-determined prices. You can also charge a flat amount or bill users on the basis of the numbers of a product purchased by them.
Smooth Management of Varying Demand for Products
We are living in uncertain times, where companies must be ready to deal with highly fluctuating demand for their products. Salesforce Revenue Cloud goes a long way in empowering firms to handle dynamic demand levels with high efficacy; the robust revenue management application helps businesses scale delivery of services and optimize sales. You can offer new products as add-ons to your customers and simplify renewal processes.
Complete Visibility of Revenue Data
Salesforce Revenue Cloud facilitates total visibility into the data pertaining to your revenue-generating activities. The powerful solution can be integrated with a wide range of enterprise software systems including Enterprise Resource Planning (ERP) and accounting applications, enabling smooth, timely flow of business data. You can also fully integrate Salesforce Revenue Cloud with Einstein Analytics, Salesforce’s AI-powered analytics platform.
Hassle-free Facility to Provide Pricing Information
You can use Revenue Cloud to help your customers find information about the prices of your products easily on their preferred channel. You can enable your customers to access product catalogs at any point of time with little effort. The smart revenue optimization solution from Salesforce also enables you to automatically update the catalogs with the latest prices. Furthermore, Salesforce Revenue Cloud allows your customers to connect with your reps using any connected device.
Seamless Automation of Revenue Cycles
Salesforce Revenue Cloud helps you improve your revenue handling efficiency considerably by automating the entire revenue cycle. You can define rules to enable automatic validation of business transactions while fully complying with applicable norms and policies. The Salesforce revenue management solution also helps you shorten revenue cycles and streamline your quote pricing process.
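As one hedged example of such a rule, a validation rule on Opportunity can block a transaction that discounts too deeply without a recorded justification. The Discount__c and Discount_Reason__c fields are assumptions used only for illustration; in Salesforce CPQ, similar checks can also be expressed as product or price rules.

```
/* Error condition: block saves where the discount exceeds 30%
   and no justification has been recorded. Both fields are assumed
   custom fields, shown for illustration only. */
AND(
  Discount__c > 0.30,
  ISBLANK(Discount_Reason__c)
)
```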
As you can see, Salesforce Revenue Cloud is a feature-packed application that enables you to optimize your revenue-generating processes and meet the ever-evolving needs of your customer. You must tie up with a reputed Salesforce partner to make the best use of this powerful product. Make sure the partner fully understands your specific business needs, in order to get the highest ROI on the novel solution.
Hope you liked this post. How do you use Salesforce Revenue Cloud? We’d love to know.
manrastechnology · 1 year ago
How to choose the Right Salesforce Augmentation Partner?
What is salesforce augmentation?
Salesforce augmentation helps a business with digital transformation by providing all the necessary tools, advice, insights, and skills to improve output and customer service and meet the company's goals.
Why do you need a Salesforce augmentation partner?
Mentioned below are some of the ways Salesforce Augmentation helps:
Better Customer Relationship Management (CRM): Salesforce helps a company find the right target audience and pinpoint the likes and dislikes of its customer base. This results in better CRM, which improves business.
Task Automation: Salesforce provides automation tools like Process Builder and Workflow Rules, which help integrate a business with other enterprise systems such as ERP, marketing automation, and e-commerce platforms.
Analytical Decision Making: By making use of tools like Salesforce Einstein Analytics, Salesforce augmentation can help any firm reach success. Salesforce helps a company look through the data and make sound decisions based on it.
Unique Customer Experience: Good customer service and even better customer feedback are among the keys to success for a company. Understanding the needs of the customer and keeping track of individual likes and dislikes is something Salesforce augmentation can help with.
Flexibility: Salesforce augmentation brings flexibility and adaptability to the functioning of any project. This helps in smoother decision-making and saves time to focus on the major tasks.
How to choose the best Salesforce augmentation partner?
Choosing the right Salesforce augmentation partner is important for a successful Salesforce implementation. Here’s a detailed guide to help you make the right choice:
Define Your Requirements: Clearly state the objectives of your business, project scope, timelines, and budget limits. Check for the specific areas where you need Salesforce augmentation.
Assess Skills and Expertise: Look for partners with extensive experience and knowledge of Salesforce. Make sure to check their certifications, industry experience, and client feedback. A partner with a good record of successful projects similar to your requirements is always preferred.
Gather Industry Knowledge: Consider partners who understand your industry and its unique challenges. Industry-specific expertise can ensure that the solution meets your business needs effectively.
Review Service Offerings: Assess the range of services offered by potential partners, including consulting, implementation, customization, integration, training, and support. Ensure that their services align with your project requirements.
Check Technology Proficiency: Verify the partner’s proficiency in Salesforce technologies, including Salesforce Lightning, Apex, Visualforce, Salesforce CPQ, Salesforce Einstein, etc. They should be up-to-date with the latest Salesforce releases and best practices.
Evaluate Communication and Collaboration: Effective communication and collaboration are essential for a successful partnership. Evaluate the partner’s communication channels, responsiveness, and ability to understand your needs.
Consider Cultural Fit: Assess the cultural fit between your organization and the partner. A compatible working culture can facilitate smoother collaboration and better outcomes.
Review Support and Maintenance: Ensure that the partner offers ongoing support and maintenance services post-implementation. Check their support model, response times, and escalation procedures.
Assess Scalability: Consider the partner’s scalability to accommodate your future growth and evolving needs. They should have the capacity to scale resources and support additional functionalities as your business expands.
Check References and Case Studies: Request references and case studies from the partner to validate their capabilities and success stories. Speaking with their previous clients can provide valuable insights into their performance and customer satisfaction levels.
Wrapping up 
Selecting the right Salesforce augmentation partner is now more valuable than ever for a successful digital transformation. Specify your needs upfront and match up with partners that can deliver the level of quality you need; pay attention to their credentials and the specializations that fit your industry.
Evaluate your options with Manras as your Salesforce augmentation partner: we are committed to complete support, from service through maintenance after installation.
siva3155 · 6 years ago
300+ TOP WINDOWS SYSTEM ADMINISTRATOR Objective Questions
Windows System Administrator Multiple Choice Questions :-
1. Which of the following administrative thinkers has defined administration as "the organization and direction of human and material resources to achieve desired ends" ? A. L. D. White B. J. M. Pfiffner C. J. A. Veig D. H. A. Simon Ans : B
2. Which one of the following statements is not correct in respect of New Public Management ? A. It has market orientation B. It upholds public interest C. It advocates managerial autonomy D. It focuses on performance appraisal Ans : B
3. 'Good Governance' and 'Participating Civil Society for Development' were stressed in World Bank Report of— A. 1992 B. 1997 C. 2000 D. 2003 Ans : A
4. If the administrative authority within a department is vested in a single individual, then that system is known as— A. Board B. Bureau C. Commission D. Council Ans : B
5. Globalisation means— A. Financial market system is centered in a single state B. The growth of a single unified world market C. Geographical location of a firm is of utmost importance D. Foreign capitalist transactions Ans : B
6. By whom was the 'Managerial Grid' developed ? A. Blake and White B. Blake and Schmidt C. Blake and Mouton D. Mouton and Shophan Ans : C
7. Who among the following says that public administration includes the operations of only the executive branch of government ? A. L. D. White and Luther Gulick B. L. D. White C. Luther Gulick D. W. F. Willoughby Ans : C
8. The concept of the 'zone of indifference' is associated with— A. Decision-Making B. Leadership C. Authority D. Motivation Ans : C
9. Who has analysed the leadership in terms of 'circular response' ? A. C. I. Barnard B. M. P. Follett C. Millet D. Taylor Ans : B
10. Simon proposed a new concept of administration based on the methodology of— A. Decision-making B. Bounded rationality C. Logical positivism D. Satisfying Ans : C
WINDOWS SYSTEM ADMINISTRATOR Objective Questions
11. Who wrote the book 'Towards A New Public Administration: The Minnowbrook Perspective'? A. Frank Marini B. Dwight Waldo C. C. J. Charlesworth D. J. M. Pfiffner Ans: A
12. Who rejected the principles of administration as 'myths' and 'proverbs'? A. W. F. Willoughby B. Herbert Simon C. Chester Barnard D. L. D. White Ans: B
13. The classical theory of administration is also known as the— A. Historical theory B. Mechanistic theory C. Locational theory D. Human Relations theory Ans: B
14. How many principles of organization were propounded by Henry Fayol? A. 10 B. 14 C. 5 D. 9 Ans: B
15. Simon was positively influenced by ideas of— A. Terry B. Barnard C. L. D. White D. Henry Fayol Ans: B
16. Negative motivation is based on— A. Fear B. Reward C. Money D. Status Ans: A
17. 'Job loading' means— A. Shifting of an employee from one job to another B. Deliberate upgrading of responsibility, scope and challenge C. Making the job more interesting D. None of the above Ans: B
18. The theory of 'Prismatic Society' in Public Administration is based on— A. Study of public services in developed and developing countries B. Institutional comparison of public administration in developed countries C. Structural-functional analysis of public administration in developing countries D. Historical studies of public administration in different societies Ans: C
19. Who among the following is an odd thinker? A. Taylor B. Maslow C. Herzberg D. Likert Ans: A
20. Which of the following is not included in 'hygiene' factors in Herzberg's two-factor theory of motivation? A. Salary B. Working conditions C. Company's policy D. Responsibility Ans: D
21. What should a system administrator use to disable access to a custom application for a group of users? A. Profiles B. Sharing rules C. Web tabs D. Page layouts Ans: A
22. Universal Containers needs to track the manufacturer and model for specific car companies. How can the system administrator ensure that the manufacturer selected influences the values available for the model? A. Create the manufacturer field as a dependent picklist and the model as a controlling picklist. B. Create a lookup field from the manufacturer object to the model object. C. Create the manufacturer field as a controlling picklist and the model as a dependent picklist. D. Create a multi-select picklist field that includes both manufacturers and models. Ans: C
23. Sales representatives at Universal Containers need assistance from product managers when selling certain products. Product managers do not have access to opportunities, but need to gain access when they are assisting with a specific deal. How can a system administrator accomplish this? A. Notify the product manager using opportunity update reminders. B. Enable sales teams and allow users to add the product manager. C. Use similar opportunities to show opportunities related to the product manager. D. Enable account teams and allow users to add the product manager. Ans: B
24. What should a system administrator consider before importing a set of records into Salesforce? (There are two correct answers.) A. The import file should include a record owner for each record. B. Currency field values will default to the personal currency of the record owner. C. Data should be de-duplicated in the import file prior to import. D. Validation rules are not triggered when importing data using the import wizard. Ans: A, C
25. Which statements about custom summary formulas in reports are true? (There are two correct answers.) A. Reports can be grouped by a custom summary formula result. B. Custom summary formulas can reference a formula field within a report. C. Custom summary formulas can reference another custom summary formula. D. Custom summary formulas can be used in a report built from a custom report type. Ans: B, D
26. Which of the following utilities provides a report of memory status for instances, databases, and agents? A. db2mtrk B. db2mchk C. db2expln D. db2memview Ans: A
27. Given the following notification log entry: In order to determine the name of the application which encountered the error, which of the following actions must be taken? A. Issue DB2 LIST DCS APPLICATIONS and search for AC14B132.OB12.0138C7070500 B. Issue DB2 LIST APPLICATIONS and search for AC14B132.OB12.0138C7070500 C. Issue DB2 LIST DCS APPLICATIONS and search for 660 D. Issue DB2 LIST APPLICATIONS and search for 660 Ans: B
28. Given the table MYTAB: A. The index MYINX will not be created. B. The word UNIQUE will be omitted by DB2 and a non-unique index MYINX will be created. C. The unique index MYINX will be created and the rows with duplicate keys will be deleted from the table. D. The unique index MYINX will be created and the rows with duplicate keys will be placed in an exception table. Ans: A
29. Which of the following REORG table options will compress the data in a table using the existing compression dictionary? A. KEEPEXISTING B. KEEPDICTIONARY C. RESETDICTIONARY D. EXISTINGDICTIONARY Ans: B
30. Which of the following statements about SCHEMA objects is true? A. A schema must always be associated with a user. B. Triggers and sequences do not have schemas associated with them. C. After creating a new database, all users who successfully authenticate with the database have the ability to create a new schema. D. If a schema is not explicitly specified in a SQL statement, the PUBLIC schema is assumed. Ans: C
31. An iSeries server with Domino and WebSphere Commerce Suite installed is performing well, except when the Domino or WebSphere servers are starting. Which of the following is the most likely cause of the performance problem? A. Job queue ASERVER in subsystem ASERVER is set to single-thread jobs. B. The activity level in the shared pool running the servers is too low. C. The managing Domino server instance is not started prior to the WebSphere server instance. D. The system value QMLTTHDACN (Multithreaded job action) is set to stop non-thread-safe processes. Ans: B
32. A system administrator needs to add 100 users to a V6R2 system without impacting response times. Which of the following would be the first step in determining the current performance of the system? A. Define a performance collection agent in iDoctor for iSeries. B. Define a performance collection object within iSeries Navigator. C. Use Performance Explorer to collect generalized performance data. D. Use the Workload Estimator to show existing performance constraints. Ans: B
33. A batch subsystem is established to run jobs from multiple queues. An application submits two jobs to each queue. All jobs are in release status. Which of the following is the reason that only three jobs have started to run? A. The total MAXACT for all of the job queue entries is three. B. Only three of the job queue priorities have been defined in the subsystem. C. The activity level setting associated with the subsystem description is 3. D. The execution of the pending jobs is governed within the application of the jobs currently running. Ans: A
34. What command is used to save the IFS to tape? A. SAV B. SAVRST C. SAVDLO DLO(IFS) D. SAVSYS OBJTYPE(IFS) Ans: A
35. A manufacturing company has three remote sites and a total of six distributed AS/400 systems. The company would like to accomplish the following: A. Centralize to a single system. B. Maintain each system's workload and identity attributes. C. Reduce the total cost of ownership of maintaining individual systems. D. Maintain existing performance levels. Ans:
36. Which of the following can have an impact on determining the interactive workload requirements of an iSeries? A. Active subsystems B. Active controllers C. Active user profiles D. Active display sessions Ans: D
37. On a system that does not have Performance Tools for iSeries (5722-PT1), how can performance data be collected for analysis on a different system that has Performance Tools installed? A. Use Electronic Service Agent for iSeries. B. Use the Start Performance Monitor command. C. Start a Performance Monitor in iSeries Navigator. D. Install and run OS/400 product option 42, Performance Collection Client. Ans: C
38. Which of the following commands will save all objects in the IFS? A. SAV B. SAVDLO C. SAVLIB "IFS" D. SAVLIB *NONSYS Ans: A
39. Which system performance command can send output to both the online screen and a database file simultaneously? A. WRKSYSACT B. WRKACTJOB C. WRKDSKSTS D. WRKSYSSTS Ans: A
40. Using the example below, what is the cumulative PTF level of the system? Display PTF Status: System ANYSYS, Product ID 5722999, IPL source ##MACH#B, Release of base option V5R1M0 L00. PTF ID / Status / IPL Action: TL02134 Temporarily applied, None; TL02071 Permanently applied, None; TL01226 Superseded, None; RE01066 Permanently applied, None; QLL2924 Permanently applied, None; MF29379 Permanently applied, None; MF29287 Permanently applied, None. A. TL02071 B. TL02134 C. RE01066 D. MF29379 Ans: B
41. Job description FREDJOBD has public authority of *USE. This job description specifies that jobs run under user profile FRED, which has public authority *EXCLUDE. User profile SUE is user class *USER with default special authorities and does not have specific authority to use profile FRED. Under which security levels would user profile SUE be able to successfully submit a job using FREDJOBD with the intent to run the job under user profile FRED? A. At 20 only B. At 20 and 30 C. At 20, 30, and 40 D. At 20, 30, 40, and 50 Ans: B
42. On an iSeries running V5R2, which of the following authorities needs to be removed from an object to ensure that the object owner cannot delete it? A. Object alter authority B. Object existence authority C. Object management authority D. Object operational authority Ans: B
43. Which of the following disk drive considerations has the greatest impact on system performance? A. Number of disk drives on a bus B. Used versus available disk space C. Data protection method (RAID versus mirroring) D. Peak rate of disk requests versus the number of disk arms Ans: D
44. Which of the following will enable an administrator to establish an ongoing analysis of the performance and utilization of an iSeries? A. Upload the performance data to the IBM Workload Estimator. B. Activate Performance Management/400 to automatically upload data to IBM. C. Enable PPP communications between the iSeries and the IBM Benchmark Center. D. Utilize the no-charge OS/400 performance analysis tools to print monthly reports. Ans: B
45. Users are reporting long response time delays in transactions that previously would run with subsecond response. The system administrator knows these transactions are RPG IV SQL-based programs. Which of the following would be the first step in determining the problem? A. Use Management Central to start a job monitor for the system and select the SQL submonitor display. B. Use the iSeries Navigator SQL Performance Monitor to collect and analyze the SQL performance of the jobs. C. Start debug on one of the online jobs and use Performance Explorer to analyze the job's database access plan. D. Use iSeries Navigator to create a database map of the tables used in the transactions and watch for high I/O rates. Ans: B
46. A system administrator wants to collect performance data for multiple iSeries servers in a network. When attempting to start performance collection using iSeries Navigator, collection services fail. Which of the following is the most likely cause of the problem? A. The Management Central central system is not connecting. B. Performance Tools (5722-PT1) is not installed on all systems. C. The user profile being used to start collection does not have *SYSADM special authority. D. The Management Central performance collector plug-in is not installed in iSeries Navigator. Ans: A
47. Which of the following can be adjusted daily by a scheduled job to maximize interactive or batch throughput? A. Memory pools B. QACTJOB system value C. Auxiliary storage pools D. QMAXACTLVL system value Ans: A
48. Which of the following would be the most viable course of action? A. Implement High Availability and replicate each system to a single system. B. Implement LPAR on a centrally located system and consolidate the individual systems onto multiple partitions. C. Transfer all systems to the central location and create an iSeries cluster to form a single "logical" system. D. Consolidate all the systems into a single partition on a system with a CPW rating that matches or exceeds the accumulated CPW of the individual systems. Ans: B
49. Which of the following should be considered when preparing for a release upgrade? A. User changes made to IBM-supplied commands. B. Valid license information for the current release. C. Presence of user libraries after QSYS in the system library list. D. Installation of the latest cumulative package for the current release. Ans: A
50. An iSeries administrator would like to monitor the performance of all four of the machines in the data center at one time. If a certain level of interactive CPU is reached, the administrator wants to receive an email. What needs to be done to satisfy the requirements? A. In Management Central, set up a system group monitor job with a threshold action. B. In iSeries Navigator, create endpoints for each of the systems and set up a threshold job monitor. C. Add an exit program to the WRKACTJOB command to intercept the CPU statistics and process the email. D. Start PM/400 and use the PM/400 notification options to send the email when the threshold is reached. Ans: A
51. A DBA wishes to audit all access to the non-audited table OWNER.EMPLOYEE. Assuming no audit traces are started, which of the following steps are needed to audit access to this table? A. -START TRACE(AUDIT) CLASS(5) B. -START TRACE(AUDIT) CLASS(4,5) C. -START TRACE(AUDIT) CLASS(4,5) and ALTER TABLE OWNER.EMPLOYEE AUDIT ALL D. -START TRACE(AUDIT) CLASS(4,5) and ALTER TABLE OWNER.EMPLOYEE DATA CAPTURE CHANGES Ans: C
52. A company uses TRUSTED CONTEXT "ERP1" and ROLE "ERP_ROLE" as a security mechanism to limit security exposure for an application. All the DB2 objects (databases, table spaces, tables, indexes, views, plans and packages) have been created by that ROLE. The ROLE "ERP_ROLE" has been assigned to user ID "DBA01" in order to perform DBA-related tasks. When the user "DBA01" leaves the company, the authorization ID is removed. Which of the following statements are correct? (Select two answers.) A. None of these DB2 objects need to be recreated to re-grant the privileges. B. The related plans and packages have to be recreated and the privileges re-granted. C. When removing user "DBA01" privileges, none of these DB2 objects need to be dropped. D. Only the related databases, table spaces, tables, indexes and views need to be recreated and the privileges re-granted. E. To remove the privileges of user "DBA01" on these related plans and packages, they have to be dropped and as a result all associated privileges are revoked. Ans: A, C
53. At which of the following times is the access control authorization routine (DSNX@XAC) invoked? A. At DB2 startup. B. When executing a DB2 GRANT statement. C. When DB2 has cached authorization information. D. During any authorization check if NO was specified in the USE PROTECTION field of the DSNTIPP panel. Ans: A
54. If an object is created statically by a role within a trusted context and the ROLE AS OBJECT OWNER clause is specified, who becomes the object owner when executing the package? A. The role B. The schema name C. The owner keyword D. The current SQLID, if set Ans: A
55. A DBA needs to use the DSN command processor to delete DB2 packages that are no longer needed. Which of the following choices is correct for the DBA to use? A. SPUFI or QMF with the DROP statement B. FREE PACKAGE ... C. DROP PACKAGE ... D. DROP PLAN ... PKLIST ... Ans: B
56. Which of the following tools is used to create subscription sets and add subscription-set members to subscription sets? A. Journal B. License Center C. Replication Center D. Development Center Ans: C
57. Given the following table definition: STAFF (id INTEGER, name CHAR(20), dept INTEGER, job CHAR(20), years INTEGER, salary DECIMAL(10,2), comm DECIMAL(10,2)). Which of the following SQL statements will return a result set that satisfies these conditions: displays the department ID and total number of employees in each department; includes only departments with at least one employee receiving a commission (comm) greater than 5000; sorted by the department employee count from greatest to least. A. SELECT dept, COUNT(*) FROM staff GROUP BY dept HAVING comm > 5000 ORDER BY 2 DESC B. SELECT dept, COUNT(*) FROM staff WHERE comm > 5000 GROUP BY dept, comm ORDER BY 2 DESC C. SELECT dept, COUNT(*) FROM staff GROUP BY dept HAVING MAX(comm) > 5000 ORDER BY 2 DESC D. SELECT dept, comm, COUNT(id) FROM staff WHERE comm > 5000 GROUP BY dept, comm ORDER BY 3 DESC Ans: C
58. Which of the following DB2 data types CANNOT be used to contain the date an employee was hired? A. CLOB B. TIME C. VARCHAR D. TIMESTAMP Ans: B
59. A table called EMPLOYEE has the following columns: NAME, DEPARTMENT, PHONE_NUMBER. Which of the following will allow USER1 to modify the PHONE_NUMBER column? A. GRANT INDEX (phone_number) ON TABLE employee TO user1 B. GRANT ALTER (phone_number) ON TABLE employee TO user1 C. GRANT UPDATE (phone_number) ON TABLE employee TO user1 D. GRANT REFERENCES (phone_number) ON TABLE employee TO user1 Ans: C
60. Which two of the following SQL data types should be used to store a small binary image? A. CLOB B. BLOB C. VARCHAR D. GRAPHIC E. VARCHAR FOR BIT DATA Ans: B, E
61. Which of the following CLI/ODBC functions should be used to delete rows from a DB2 table? A. SQLDelete() B. SQLExecDirect() C. SQLBulkDelete() D. SQLExecuteUpdate() Ans: B
62. Given the tables T1 and T2 with INTEGER columns: How many rows will be left in T1 after running this statement? A. 0 B. 2 C. 3 D. 6 Ans: B
63. Given the following code: EXEC SQL EXECUTE IMMEDIATE :sqlstmt Which of the following values must sqlstmt contain so that all rows are deleted from the STAFF table? A. DROP TABLE staff B. DELETE FROM staff C. DROP * FROM staff D. DELETE * FROM staff Ans: B
64. An ODBC/CLI application executes the following batch SQL: SQLExecDirect(hStmt, "SELECT c1 FROM t1; SELECT c2 FROM t2;", SQL_NTS); Which API is used to discard the first result set and make the second available for processing? A. SQLFetch() B. SQLRowCount() C. SQLMoreResults() D. SQLCloseCursor() Ans: C
65. Given the table T1 with the following data: What is the final content of the host variable "hv"? A. A B. B C. C D. D Ans: C
66. Which of the following is required to specify the output data during an EXPORT? A. Union B. Key range C. Equi-join D. Fullselect Ans: D
67. On which of the following event types can a WHERE clause be used to filter the data returned by the event monitor? A. TABLES B. DEADLOCKS C. TABLESPACES D. CONNECTIONS Ans: D
68. Given an application with the embedded static SQL statement: INSERT INTO admin.payroll (employee, salary) VALUES ('Becky Smith', 80000) Which of the following privileges must a user hold to run the application? A. ALTER on the table B. INSERT on the table C. DBADM on the database D. EXECUTE on the package Ans: D
69. Assuming a user has CREATETAB privileges, which of the following privileges will allow the user to create a table T2 with a foreign key that references table T1? A. SYSCTRL B. SYSMAINT C. UPDATE on table T1 D. CONTROL on table T1 Ans: D
70. Which of the following statements is required to register a federated database source? A. CREATE VIEW B. CREATE WRAPPER C. CREATE TRANSFORM D. CREATE TYPE MAPPING Ans: B
71. Which two of the following can be done to a buffer pool using the ALTER BUFFERPOOL statement? A. Reduce the size. B. Change the page size. C. Modify the extent size. D. Modify the prefetch size. E. Immediately increase the size. Ans: A, E
72. A table is defined using DMS table spaces with its index, data, and long data separated into different table spaces. The table space containing the table data is restored from a backup image. Which of the table's other table spaces must also be restored in order to roll forward to a point in time prior to the end of the logs? A. Long table space B. Index table space C. Temporary table space D. Index and long table spaces Ans: D
73. In a federated system, a userid is used for the CREATE SERVER definition at the DB2 data source. Which of the following privileges should it have? A. SELECT on the catalog at the DB2 data source B. UPDATE on the CREATE SERVER definition table C. SELECT on the tables for which nicknames are created D. INSERT on the tables for which nicknames are created Ans: A
74. Which of the following can enable multiple prefetchers for table spaces with a single container? A. NUM_IOSERVERS B. INTRA_PARALLEL C. DB2_PARALLEL_IO D. DB2_STRIPED_CONTAINERS Ans: C
75. Given a single physical server with a single OS image utilizing 24 CPUs and configured with 12 database partitions, how many of the database partitions can act as coordinator partitions for remote applications? A. 1 B. 2 C. 12 D. 24 Ans: C
76. When using AUTOCONFIGURE and a workload that is of type OLTP, what should be used for the workload_type parameter? A. dss B. mixed C. simple D. complex Ans: C
78. To determine the state of a table space and the location of all containers for that table space, which of the following commands would be used? A. LIST TABLESPACES SHOW DETAIL B. GET SNAPSHOT FOR TABLESPACES ON C. LIST TABLESPACE CONTAINERS FOR D. GET SNAPSHOT FOR TABLESPACES ON SHOW DETAIL Ans: B
79. The following command is issued: LOAD FROM staff.ixf OF IXF REPLACE INTO staff What locks are held during the execution of this load? A. Share lock on table STAFF B. Exclusive lock on table STAFF C. Exclusive lock on all table spaces holding components of the STAFF table D. Exclusive lock only on the table space holding the data component of the STAFF table Ans: B
80. Use the exhibit button to display the exhibit for this question. What is indicated by the output from the Health Monitor in the exhibit? A. 1 warning alert on NTINST, 2 alarms and 1 warning alert on NTDB B. 1 warning alert on NTINST, 1 alarm and 1 warning alert on NTDB C. 1 alarm and 1 warning alert on NTINST, 2 alarms and 1 warning alert on NTDB D. 1 alarm and 1 warning alert on NTINST, 1 alarm and 1 warning alert on NTDB Ans: B
WINDOWS SYSTEM ADMINISTRATOR Questions and Answers PDF Download
0 notes
quantoknack2 · 3 years ago
Text
Tumblr media
How To Define Validation Rules In Salesforce
With so many changes made to Salesforce records every day by employees, it can be hard to keep track of the quality of the incoming data. Salesforce uses Validation Rules to protect users from mistakes (such as entering phone numbers in the wrong format), to establish verification boundaries, and to standardize data requirements in general.
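In practice, a validation rule is just an error condition formula paired with an error message: when the formula evaluates to true on save, Salesforce blocks the record and displays the message. A minimal sketch for the phone-format case mentioned above (the 999-999-9999 pattern is an assumption for illustration; adapt it to your own locale and field):

/* Error condition formula: blocks the save when Phone is filled in
   but does not match the assumed 999-999-9999 pattern. */
NOT(ISBLANK(Phone)) &&
NOT(REGEX(Phone, "\\d{3}-\\d{3}-\\d{4}"))

/* Suggested error message: "Phone must be entered as 999-999-9999." */

Because the rule only fires when the formula is true, wrapping the format check in NOT() and excluding blank values keeps the field optional while still enforcing the format whenever a value is entered.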
How To Define Validation Rules In Salesforce
0 notes
a-alex-hammer · 6 years ago
Text
MIT Professor Silvio Micali’s Algorand Project Announces Date for First Auction
MIT Professor Silvio Micali's Algorand project, via the Algorand Foundation, will hold the first in a series of Dutch Auctions out of Singapore on June 19th, 2019. These auctions will allow Algos, the native tokens of the Algorand platform, to enter circulation via a mechanism where the market determines the fair price. This inaugural auction also marks the launch of the Algorand network.
The co-inventor of zero-knowledge proofs, Micali is an Italian computer scientist at the MIT Computer Science and Artificial Intelligence Laboratory and has been a professor of computer science in MIT's Department of Electrical Engineering and Computer Science since 1983. His research centers on the theory of cryptography and information security. He won the Turing Award together with Shafi Goldwasser in 2012 and the Gödel Prize in 1993.
The auction also signifies the launch of the Algorand network, which validates the maturation of MIT professor and Turing Award winner Silvio Micali’s vision to create a more robust and efficient blockchain platform that solves the blockchain trilemma and will ultimately transform the underlying technology of our global financial systems.
Algorand’s TestNet has demonstrated the platform’s ability to solve the blockchain trilemma with key milestones including:
Demonstrating decentralization – TestNet has over 500 nodes deployed around the world.
Showcasing scalability – In a series of performance tests with 10,000 globally distributed participants, TestNet consistently reaches over 1,000 transactions per second (tps) with five-second latency.
Approved security – Algorand has recently undergone security audits from Trail of Bits and NCC.
Additionally, a number of organizations have been actively engaged with the Algorand platform in its TestNet phase including Top Network, AssetBlock, Otoy Inc. and Syncsort. These organizations are leveraging Algorand to build new solutions for the borderless economy.
The Algorand Foundation believes in the value of the Algorand platform, the Algo, and the potential for everyone to have an opportunity as we enter a new, borderless economy. With a goal of investing in the sustainability and performance of that economy, the foundation has also announced a refund policy for auction participants. Should any participant be dissatisfied with their purchase of Algos, they may return their Algos back to the foundation one-year post-purchase at up to 90 percent of the value paid.
For the first auction, 25 million Algos will be available at a starting price of $10.00 USD and a reserve price of $0.10 USD. The auction will last for 4,000 blocks (about 5 hours), all bids will be posted to and processed from the Algorand blockchain for transparency and Algos will be distributed after the close of the auction. Details from the foundation regarding token dynamics can be found at algorand.foundation/token-dynamics.
Who:
The Algorand Foundation invites eligible participants to register and join the auction.
When:
The auction starts Wednesday, June 19, 2019 at 6PM Singapore Standard Time.
Where:
The auction will be conducted on the Algorand blockchain at this url.
How:
To participate in the first Algorand Foundation auction, users need to complete the registration process, including creating an account, completing compliance checks, funding a USD wallet and creating or importing an Algo wallet. These steps may take several days to finalize before bidding is possible. To begin the registration process, click here.
Please note that participation in auctions is not possible for the following excluded jurisdictions: United States of America and its territories, Canada, Democratic People’s Republic of Korea, Cuba, Syria, Iran, Sudan, Republic of Crimea, People’s Republic of China, and jurisdictions in which the auctions and/or trading of the tokens themselves are prohibited, restricted or unauthorized in any form or manner whether in full or in part under the laws, regulatory requirements or rules in such jurisdiction.
About Richard Kastelein
Founder and publisher of industry publication Blockchain News (EST 2015), a partner at ICO services collective Token.Agency ($750m+ and 90+ ICOs and STOs), director of education company Blockchain Partners (Oracle Partner) – Vancouver native Richard Kastelein is an award-winning publisher, innovation executive and entrepreneur. He sits on the advisory boards of some two dozen Blockchain startups and has written over 1500 articles on Blockchain technology and startups at Blockchain News and has also published pioneering articles on ICOs in Harvard Business Review and Venturebeat. Irish Tech News put him in the top 10 Token Architects in Europe.
Kastelein has an Ad Honorem – Honorary Ph.D. and is Chair Professor of Blockchain at China’s first Blockchain University in Nanchang at the Jiangxi Ahead Institute of Software and Technology. In 2018 he was invited to and attended University of Oxford’s Saïd Business School for Business Automation 4.0 programme.  Over a half a decade experience judging and rewarding some 1000+ innovation projects as an EU expert for the European Commission’s SME Instrument programme as a startup assessor and as a startup judge for the UK government’s Innovate UK division.
Kastelein has spoken (keynotes & panels) on Blockchain technology in Amsterdam, Antwerp, Barcelona, Beijing, Brussels, Bucharest, Dubai, Eindhoven, Gdansk, Groningen, the Hague, Helsinki, London (5x), Manchester, Minsk, Nairobi, Nanchang, Prague, San Mateo, San Francisco, Santa Clara (2x), Shanghai, Singapore (3x), Tel Aviv, Utrecht, Venice, Visakhapatnam, Zwolle and Zurich.
He is a Canadian (Dutch/Irish/English/Métis) whose writing career has ranged from the Canadian Native Press (Arctic) to the Caribbean & Europe. He’s written occasionally for Harvard Business Review, Wired, Venturebeat, The Guardian and Virgin.com, and his work and ideas have been translated into Dutch, Greek, Polish, German and French. A journalist by trade, an entrepreneur and adventurer at heart, Kastelein’s professional career has ranged from political publishing to TV technology, boatbuilding to judging startups, skippering yachts to marketing and more as he’s travelled for nearly 30 years as a Canadian expatriate living around the world. In his 20s, he sailed around the world on small yachts and wrote a series of travel articles called, ‘The Hitchhiker’s Guide to the Seas’ travelling by hitching rides on yachts (1989) in major travel and yachting publications. He currently lives in Groningen, Netherlands where he’s raising three teenage daughters with his wife and sailing partner, Wieke Beenen.
0 notes
decisionsondemand-blog · 7 years ago
Text
Yes, There Is a Better Way to Assign Leads in Salesforce
Tumblr media
How quickly do your sales reps respond to new web leads?
In 2007, the Lead Response Management Study by Dr. James Oldroyd reported that the recommended time for following up on a lead is five minutes. Nine years later, InsideSales.com published Best Practices for Lead Response Management based on Oldroyd’s study. The five-minute rule still holds true. However, the average sales rep takes much longer than the recommended time to respond to leads.
Clearly, it is critical to respond quickly to web inquiries. According to a study from InsideSales.com, 78% of sales go to the company that responds to an inquiry first. And yet, the average time a company takes to respond to a lead is 40 hours. This gap offers a chance to get ahead of the competition by ensuring your lead assignment process is automated, fast, and accurate.
There is no one right way to assign leads. The best approach depends on a company’s product and customer types. To complicate the process, speed is not the only criterion for lead assignment. It’s also important to consider factors like territory alignment, existing relationships with a customer account, skills and availability of the assigned rep, and fairness in lead distribution.
Sending Leads to the Right Rep in Salesforce
Salesforce includes built-in Lead Assignment rules that are useful for small sales teams to get started. But the rules have limitations. They are not enough to support your organization as it grows and the rules become more complex. This image shows an example of basic Salesforce lead assignment rules.
Mid- to large-sized companies need many more rules, and entering those rules becomes a time-consuming process. In fact, just managing a few dozen rules is painful.
For example, you might use criteria like these to funnel leads to your sales reps:
Location
Industry
Company size or revenue
Product interest
Lead quality
Anne, an awesome rep, handles leads from companies in the healthcare industry that have revenues of $100 million or more and are located in the Los Angeles and San Francisco Bay areas.
Implementing that rule would involve entering multiple criteria, including a list of ZIP codes or county names. The image below shows an example of more complex lead assignment rules using revenue, industry, and location as the criteria. It’s quite typical for a mid-size company to create hundreds, sometimes thousands of rules. This is especially true with fine-grained territory assignments based on ZIP or area codes, or for companies doing business internationally.
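To make that concrete, the rule for Anne sketched above might translate into assignment-rule criteria along these lines (the field values and ZIP prefixes here are hypothetical, purely for illustration):

Rule entry: Assign to Anne
  Lead: Industry EQUALS "Healthcare"
  AND Lead: Annual Revenue GREATER OR EQUAL 100,000,000
  AND Lead: Zip/Postal Code STARTS WITH 900, 901, 902, 941, 945, 950, ...

Multiply an entry like this by every rep, territory, and product line, and the rule list quickly grows into the hundreds or thousands mentioned above.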
Entering complex rules with point-and-click is cumbersome and prone to errors. To make matters worse, there is no easy way to test the rules to see if they work correctly before activating them. The people who define the rules — typically, someone in sales operations or the VP of sales — may not have the tech skills and access to validate them. This results in miscommunication and more overhead.
If reps receive the wrong leads or leads remain unassigned, there’s no way to figure out what happened. The system doesn’t keep logs for assignments or an audit trail for rule changes. It would be valuable to have a monitoring and reporting tool to stay on top of what’s happening, identify any stray leads, and take action so leads aren’t stranded.
Balancing the Load
Remember the five-minute rule? Anne may be the best rep for XYZ criteria, but it doesn’t help the company if she’s bombarded with too many leads and can’t quickly follow up on them. Even if she has a balanced number of leads, what happens if she is at a meeting or out of the office when a lead comes in? The clock is ticking, and the lead is sitting in Anne’s queue.
That’s where Round Robin and Load Balancing can help. Instead of assigning leads to individual reps, they can be distributed to members of a group. Distribution strategies can vary from a basic Round Robin to Weighted Load Balancing. They can be combined with assignment caps, checks for absent users, or skills-matching.
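For teams that want to see the mechanics before buying anything, the core of a basic round robin is small enough to sketch in Apex. The class and variable names below are invented for illustration, and this is not the implementation used by Salesforce or Decisions on Demand; a production version would also persist the rotation position between batches and layer in caps, absence checks, and skills-matching:

// Illustrative sketch only: cycles incoming leads across a fixed list of rep user Ids.
public class RoundRobinSketch {
    public static void assignOwners(List<Lead> newLeads, List<Id> repIds) {
        if (repIds == null || repIds.isEmpty()) {
            return; // no reps to distribute to
        }
        Integer position = 0;
        for (Lead l : newLeads) {
            // Math.mod wraps the counter so assignment loops back to the first rep.
            l.OwnerId = repIds[Math.mod(position, repIds.size())];
            position++;
        }
    }
}

Called from a before-insert trigger on Lead, a sketch like this spreads new records evenly across the group; weighted load balancing would replace the simple counter with per-rep weights or current open-lead counts.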
Complexity Increases as the Number of Sales Reps Grows
A company with dozens of reps has people coming and going often enough that the lead assignment rules will need frequent updates. On top of that, territories periodically change or get re-aligned.
Assigning the right person to a lead also depends on the current products and business lines that are offered. New ones get added, and old ones are removed. These changes occur on a regular basis in large companies. Acquisitions happen less often — but when they do, they will affect the lead assignment process.
Another factor to consider is a team-selling setup in an internal sales organization. You may also have partners, such as resellers and solution providers, added to the mix. In this case, you may assign to both an internal rep and a partner, or you can choose just one of them.
How to Turn Complex Lead Assignment Rules into an Easy-to-Manage Process
In summary, you want a lead assignment program that achieves the following goals:
Qualify the lead to determine which sales rep or team is the right one.
Verify the sales rep is available.
Ensure that each rep has a fair number of assignments.
Confirm that someone follows up with the leads quickly.
Make changes to the rules easy, fast, and reliable.
Monitor lead assignments with reporting and logs.
The assignment rules in Salesforce are not designed to manage this process, and Salesforce does not offer Round Robin or Load Balancing capabilities. Fortunately, the AppExchange has an app like Decisions on Demand to give you powerful tools to address these challenges. After setting up the rules, you can test them with a built-in Test Console. The unique Excel import/export capability makes it easy to update large rule sets containing thousands of rules. It includes useful distribution options like Round Robin and Load Balancing with weights, caps, and skills-matching.
Decisions on Demand for Salesforce boosts close rates, cuts overhead, and reduces errors. With this AppExchange app, your team can turn lead assignment into a completely automated process.
0 notes
a-breton · 7 years ago
Text
How to Plan a Year’s Worth of Content With One Original Research Survey
When you look at the library of content you publish, is it a string of somewhat-related blog posts, videos and more — or do all the pieces work together to tell a better, broader story?
Of course (to poorly paraphrase Robert Rose), you want your editorial to tell one story instead of each piece being disconnected from the rest.
While there are many ways to do this, one approach that works well is to use a survey-based research project to bring focus to all of your editorial. That research can serve as your keystone from which spring many assets and related stories.
This article walks you through the steps of designing and publishing survey-based original research to make it the cornerstone for your editorial plan.
But, before I begin, an important note: The goal of your research is to be able to tell a compelling story validated with data. The quality of your data is, of course, important, but it needs to relate to a broader story. Constantly ask yourself: Why will someone care?
HANDPICKED RELATED CONTENT: Thinking of Creating Original Research? 8 Things to Consider
Step 1: Choose a topic
Spending time deciding on the area of focus is even more important when you plan to use your survey as your editorial glue.
The best research topics check three boxes. They need to:
Interest your audience (Note: This, of course, requires you to define your audience. Customer? Prospects? Media or influencers?)
Align with your brand story
Focus on an area not yet covered with research
If you are in a new space or trying to create a new category, an original research project on the state of the industry could be ideal. YOU become the source of authority and you are what people link to because you have the stats — especially if you repeat this study annually to show trends.
If you are in a crowded industry, such as, ahem, content marketing, focus on a niche. What is that thing you want to be known for?
This is a challenge Brody Dorland and team faced when embarking on their research project in 2017. Their tool, Divvy HQ, helps content marketers, but as Brody explains:
We did not want to do a state of content marketing report because others had already done so. Instead, we decided to focus specifically on content planning, which is something that had not been covered – and it’s something our business directly helps marketers with. This research was a way for us to better understand the challenges our customers face, validate the direction of our product roadmap, and provide insights that marketers can use to benchmark their own content planning process.
TIP: Answer this question: How do you want your audience to think or act differently as a result of reading this research?
Do you want to validate current thinking? Challenge a belief or assumption? Reveal an opportunity? Keep your reasoning top of mind.
Example: Let’s say you work for a workflow software company, and you want more content marketing teams using your platform. Studying team productivity is too broad, so you focus your research on how content marketing teams operate and whether their processes are working.
Step 2: Pinpoint the survey ‘dimensions’
Once you know your general topic, identify the specific topics you want your research to cover. I call these areas of focus “dimensions” — the key categories you want the research to study. Think of these dimensions as a table of contents. You can see how CMI’s annual content marketing research easily falls into a table-of-contents format.
Dimensions serve as a way to organize your thinking, prioritize the questions you want to ask, and provide structure for your data analysis. (I’ll explain all of these steps in more detail as we go through the process.)
Example: Continuing the workflow platform solution company example, to study content marketing teams, you choose these dimensions:
Team composition: Which people and skills do people have on their teams? Are people in-house or outsourced?
Communication: How are teams communicating with each other — and what’s working? 
TIP: The example only looks at a subset of topics. As a rule of thumb, identify three to five dimensions.
Step 3: Hypothesize your story for each dimension
Once you know the key dimensions, hypothesize what the results will tell you.
Now, this is important: You aren’t designing the research to end up with a specific angle. You are using your research to test your hypothesis.  
If the results differ from what you expect, that’s OK. In fact, that may be a story.
CoSchedule embraced this hypothesis vs. results idea in its State of Marketing Strategy in 2018. I love how the team details what it expected the data to show and the actual findings.
Example: The chart illustrates the hypotheses for each dimension established for the workflow software company.
Step 4: Draft questions to test your hypotheses
Next, draft the questions for each dimension to test your hypotheses.
How the questions are asked is incredibly important. (If you are unfamiliar with survey design, consider getting help with this step of the process, including these pointers.)
Example: Add a column to the table for “possible questions.” As you see, the questions you ask will help you determine whether your hypothesis is correct.
TIP:  Ask only questions that will provide insight. Continually ask, “How will I use the data from this question?” If you are uncertain, chances are you don’t need to ask the question.
Step 5: Identify key segments for comparison
If you want to do comparisons, looking at the data through that lens will offer more opportunities to tell the story in a nuanced and useful way. Salesforce does a great job comparing segments in its research, as you can see in State of Service.
TIP: Any segment you report on needs to have an adequate sample size. While there is no hard and fast rule as to what constitutes an adequate sample, aim for at least 100 participants.
Example: With the workflow software company research, the segmenting goal is to understand the differences between marketers who consider themselves productive versus those who do not. You also could compare the habits between those who use a workflow management tool versus those who rely on email.
Step 6: Analyze the findings by dimension and segment
Once you have the data, your goal is to identify the story. As Rachel Haberman, content marketing manager at Skyword, told me:
Don’t fall into the trap of simply presenting your data as is. Find the story behind your data – that’s what’s meaningful. Numbers alone are forgettable, but when you use them to tell a story, they reinforce an emotional appeal with data-driven evidence. It’s the best of both worlds.
Because you have gone through these steps, it will be easier to identify what your story is. Go back to your dimensions and organize the results in these categories. Does the data support your hypothesis or were you surprised?
Keep a list of the insights and ideas you uncover, looking at the dimensions as a whole and through individual questions. You can use these later.
Step 7: Create a home base for your findings
When it’s time to publish your findings, decide on one spot on your website where you will consistently point all traffic from your research. Not only does this keep things simple, but it is key to help get backlinks, which is a huge benefit of using your research in multiple ways.  (In fact, as Aleh Barysevich uncovered in the study SEO PowerSuite’s Link Building in 2017, SEO professionals consider research to be the most efficient type of content for getting backlinks.)
If you aren’t familiar with backlinks, understand this: the more backlinks to a page, the more authority it will have and the more likely it is to rank in search engines. Yet, our research with BuzzSumo found only 49% of marketers earn backlinks from their research — a missed opportunity.
Your research home base can take any number of forms, but if you’re getting started, create a blog post that details your findings, such as this one from Orbit Media’s 4th Annual Blogger Study or this one from Buffer covering The State of Remote Work. Alternatively, consider a landing page like this one from Upwork.
Step 8: Brainstorm the type of content you can create
You can create many types of content from your research findings.
Doing more with your research is a substantial opportunity. Not only will this help you get backlinks (remember to point to your home base), but it also helps you tell a more cohesive story.
If you are at a loss of what to do with your research, consider these 13 examples. I’ll get specific in the next step, but it’s helpful to understand the general type of content you can create:
SlideShare presentations
Blog posts on your website
Articles on other websites
Additional reports
Webinars and presentations
Infographics
Standalone data graphics
Video of high-level findings
Video series digging into findings
Online assessment
Podcasts
Twitter chat
Gated guides and e-books
HANDPICKED RELATED CONTENT: 10 (Mostly) Quick Wins to Steal for Your Original Research Project
Step 9: Track your story ideas
Armed with the general ideas of the type of content you can create as well as the insights you documented when you were analyzing your results (see Step 6), it’s time to get specific.
I use this template to track my brainstorming and story ideas from a research study.
Of course, the template can be customized, but I track:
Content type: Blog post, guest post, webinar, etc. (See ideas in Step 8.)
Content idea: What is the basic idea? Briefly explain why someone will care about this.
Data point: What stat(s) does this story relate to?
Notes: What else would you or someone reviewing this benefit from knowing?
Publication/platform: Where will you publish?
Priority: As your ideas grow, which ones do you want to execute first?
Owner/responsibility: Who will oversee this?
Attachments: What links or files do I need?
As part of this process, you also can include a list of content to be updated with a link to your research. For example, have you published any blog posts that would benefit from one of the stats?
TIP: I use Airtable for this, but you can use a spreadsheet or other format to capture your thoughts. In Airtable, you can create a tab that stores all the stats. You can link the stat to the story — and in turn, when you view all stats, you’ll see the stories using them.
Step 10: Incorporate your ideas into your editorial calendar
As DivvyHQ uncovered in its annual content planning survey, only a quarter of marketers plan several months in advance. You may feel overwhelmed by the idea of planning out a year’s worth of content with one survey.
While your process and cadence will vary, consider these ideas to make it manageable:
Focus in depth on one dimension each quarter. (You could even consider each dimension as a topic cluster.)
Determine a regular cadence to publish blog posts that delve into research findings, such as one article per month.
Have a library of data graphics to share on social. When possible, link to content that talks about the story behind the data.
Look at the presentations in the coming year and see if any of them can incorporate a tie-in to your data.
Even if you are an ad-hoc planner (is that an oxymoron?), your idea tracker will still be incredibly valuable for your content planning. Continually refer to it as you decide which content to tackle next.
HANDPICKED RELATED CONTENT:
Want Content That’s More Usable & Reusable? Chunk It
Editorial Calendar Tools and Templates
Summary
By using the system above, your research can become the glue that holds together your editorial. Remember, if you plan well, you can create a survey from which you can tell many stories in many ways.
I’d love to hear from you. Do you have any examples of brands who are using their research to tell rich, continual stories? Share in the comments.
Please note: All tools included in our blog posts are suggested by authors, not the CMI editorial team. No one post can provide all relevant tools in the space. Feel free to include additional tools in the comments (from your company or ones that you have used).
You can learn more about how research is the unsung hero in content marketing from Michele Linn during her presentation at Content Marketing World Sept. 4-7 in Cleveland, Ohio. Register today and use the code BLOG100 to save $100. 
Cover image by Joseph Kalinowski/Content Marketing Institute
0 notes