# Table Header Cell Error
shawnjordison · 2 years ago
Table Header Cell Has No Associated Sub Cells
Learn how to resolve the 'Table Header Cell Has No Associated Sub Cells' error in PAC 2021 and enhance PDF accessibility! #PDFUACompliance, #PAC2021, #TableHeaderCellError, #AdobeAcrobat, #WCAG, #AssistiveTechnology
Welcome to an in-depth exploration of a specific issue in PDF/UA compliance testing: the ‘Table Header Cell Has No Associated Sub Cells’ error. This error often comes up when using the PAC 2021 checker, and today we’re going to walk you through how to address it. Video Overview: the video covers the same content as this post in video format. Understanding the ‘Table…
fugu-in-f · 1 year ago
it's fucking terrible garbage
a corporate build-it-yourself comedy of errors
a pointlessly large aggregation of stuff that does not work and stuff that companies rebuild that does not work
oh my god if it builds a table and there's a clickable link in a table cell, it's not actually a table cell in the middle of the table, that's a table header and that table cell does not exist
sometimes it makes its own browser subtabs and sometimes it opens its own browser page as subtabs and then you've got three instances of Salesforce nested inside your Salesforce webpage
its search features sometimes just give up when you're searching for a specific string or number
its duplicate rules checking is arcane and asinine
oh, you loaded a page too fast? waow, component error
its permission hierarchy is incomprehensible - have a role and a permission and a permission set and a permission set group and no these do not overlap how you think they do
I test this shit for a living and I'm incrementing violent decades and maybe it's just all my terrible programmers writing garbage on the platform, but it's awful and I hate it
what the fuck is a salesforce. we're briefly alive for a few violent decades
advancedexcelinstitute · 30 days ago
Top 10 VBA Macro Examples to Boost Productivity in Excel
In today’s data-driven world, Excel remains a powerhouse for professionals in finance, operations, marketing, and beyond. But even Excel has its limits—especially when it comes to repetitive, manual tasks. That’s where VBA macros come in.
Using Excel automation, you can streamline processes, reduce errors, and save hours of work each week. Whether you’re new to VBA or just looking for practical examples, this guide shares 10 powerful, real-world macros that will supercharge your Excel productivity.
1. Auto-Format Reports with One Click
If you often deal with raw data dumps or monthly reports, formatting them manually can be tedious. This macro automates common formatting tasks: bolding headers, centering text, and adjusting column widths.
Why it’s helpful: Clean, professional reports in seconds. Ideal for managers and clients who expect consistency.
2. Highlight Duplicate Values Automatically
Cleaning up data often means identifying duplicate entries. While Conditional Formatting does the job, automating it with a macro allows for instant results across any selected range.
Use it when: You’re validating lists like customer names, product IDs, or transaction records.
Excel automation benefit: You reduce human error and accelerate your data-cleaning process.
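For illustration, here is the duplicate-finding logic sketched in Python (a minimal sketch, assuming cell values arrive as a plain list — the actual VBA macro would loop over the selected range and apply a fill color to the matching cells):

```python
# Minimal sketch of duplicate detection; a VBA macro would read these
# values from the selected range instead of a list.
from collections import Counter

def find_duplicates(values):
    """Return the set of values that appear more than once."""
    counts = Counter(values)
    return {value for value, n in counts.items() if n > 1}
```

Calling `find_duplicates(["A001", "A002", "A001"])` flags `A001`; the macro version would then highlight every cell holding a flagged value.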
3. Send Emails Directly from Excel
Need to send personalized emails based on Excel data? With VBA, you can automate email generation via Outlook. Whether it’s for client updates, performance summaries, or billing reminders, this macro pulls values from cells and drafts messages automatically.
Example use case: Pulling the email address, name, and message from columns and sending batch emails without ever opening Outlook manually.
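To make the drafting step concrete, here is a hedged sketch using Python's standard email module (the addresses, names, and subject template are illustrative assumptions; the VBA version would create Outlook MailItem objects instead):

```python
# Draft one message per spreadsheet row; each row models (address, name, body).
from email.message import EmailMessage

def draft_email(address, name, body):
    msg = EmailMessage()
    msg["To"] = address
    msg["Subject"] = f"Update for {name}"  # hypothetical subject template
    msg.set_content(body)
    return msg

rows = [("ana@example.com", "Ana", "Your invoice is ready.")]
drafts = [draft_email(*row) for row in rows]
```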
4. Auto-Save Workbook Every Few Minutes
Data loss can happen in a split second—especially during power outages or software crashes. This macro schedules automatic saves every few minutes, giving you peace of mind during large projects.
Excel automation advantage: Focus on your work without the constant fear of losing unsaved changes.
5. Rename Multiple Sheet Tabs
If you’re working with workbooks that have multiple sheets (for clients, months, or departments), renaming them manually is inefficient. This macro reads from a list and renames your sheets instantly.
Use case: Monthly budgeting sheets like Jan, Feb, Mar… or departments like Sales, HR, IT.
6. Convert Numbers to Words
Sometimes, numbers alone aren’t enough—especially when printing checks, invoices, or legal documents. This macro spells out numbers in plain English.
Example: 1,250 becomes “One Thousand Two Hundred Fifty.”
Pro tip: You can adapt the script for different currencies or regional formats.
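The spell-out logic is just recursive string assembly; below is a minimal Python sketch for non-negative integers under one million (one reasonable implementation, not the exact macro — a VBA version would follow the same breakdown):

```python
# Spell out an integer in plain English, mirroring the macro's recursive
# breakdown into thousands, hundreds, tens, and ones.
ONES = ["", "One", "Two", "Three", "Four", "Five", "Six", "Seven", "Eight",
        "Nine", "Ten", "Eleven", "Twelve", "Thirteen", "Fourteen", "Fifteen",
        "Sixteen", "Seventeen", "Eighteen", "Nineteen"]
TENS = ["", "", "Twenty", "Thirty", "Forty", "Fifty", "Sixty", "Seventy",
        "Eighty", "Ninety"]

def to_words(n):
    if n == 0:
        return "Zero"
    if n < 20:
        return ONES[n]
    if n < 100:
        return (TENS[n // 10] + " " + ONES[n % 10]).strip()
    if n < 1000:
        rest = to_words(n % 100) if n % 100 else ""
        return (ONES[n // 100] + " Hundred " + rest).strip()
    rest = to_words(n % 1000) if n % 1000 else ""
    return (to_words(n // 1000) + " Thousand " + rest).strip()
```

`to_words(1250)` returns "One Thousand Two Hundred Fifty", matching the example above.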
7. Password-Protect All Worksheets
If you’re sharing workbooks that contain sensitive information, protecting each sheet with a password manually can be a hassle. This macro locks all sheets in a single click.
When to use: Reports for clients, financial data, or HR records.
Bonus: You can also create a companion macro to unprotect all sheets with one click.
8. Unhide All Sheets at Once
It’s common to hide worksheets for organizational or confidentiality reasons. But when it’s time to audit or review your workbook, manually unhiding sheets is slow. This macro reveals all hidden tabs in one go.
Why it matters: Speeds up auditing and avoids missing critical data tucked away in hidden sheets.
9. Automatically Create Pivot Tables
Pivot tables are a core tool for data analysis, but setting them up repeatedly can waste time. This macro generates a pivot table from a selected range—instantly.
Where it helps: Weekly sales summaries, inventory reports, or regional comparisons.
Excel automation boost: Eliminates setup time and enforces consistent layouts across reports.
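At its core, a one-field pivot table is group-and-aggregate; here is that idea sketched in Python (the `region` and `amount` field names are illustrative assumptions):

```python
# Group rows by a key column and sum a value column — the heart of a
# one-field pivot table.
from collections import defaultdict

def pivot_sum(rows, key, value):
    totals = defaultdict(float)
    for row in rows:
        totals[row[key]] += row[value]
    return dict(totals)

sales = [
    {"region": "North", "amount": 120.0},
    {"region": "South", "amount": 80.0},
    {"region": "North", "amount": 30.0},
]
```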
10. Multi-Replace Tool (Find and Replace at Scale)
When you need to replace multiple values in bulk, doing it manually with Ctrl+H is inefficient. A macro can handle dozens of replacements in a single run.
Use case: Standardizing product names, cleaning inconsistent spellings, or updating brand terminology.
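As a sketch of the bulk-replace idea, here is one way to express it in Python, assuming the replacements are given as an old-to-new mapping (the macro version would iterate the mapping and call Excel's Replace over the range):

```python
# Apply many find-and-replace pairs in a single pass over the text.
import re

def multi_replace(text, mapping):
    # Sort keys longest-first so longer terms win over their prefixes.
    keys = sorted(mapping, key=len, reverse=True)
    pattern = re.compile("|".join(re.escape(k) for k in keys))
    return pattern.sub(lambda m: mapping[m.group(0)], text)
```

Sorting keys longest-first ensures that a longer term like "Acme Corp" is replaced before its prefix "Acme" could match.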
Why You Should Use VBA Macros for Excel Automation
Macros extend Excel’s functionality far beyond formulas and charts. Here’s what makes them essential for professionals and teams:
Save Time: Automate repetitive tasks like formatting, emailing, or data updates.
Improve Accuracy: Reduce manual errors by letting Excel handle logic-driven operations.
Enhance Reporting: Build clean, consistent reports with minimal manual effort.
Boost Collaboration: Share macros with teams for standardized processes.
VBA empowers you to turn Excel into a dynamic application tailored to your exact needs.
Getting Started with VBA (Even If You’re New)
You don’t need to be a developer to use VBA. Here’s how to get started:
Enable the Developer Tab
Go to File > Options > Customize Ribbon > Check “Developer”.
Use the Macro Recorder
Record repetitive actions to auto-generate VBA code.
Create a Module in the VBA Editor
Press Alt + F11, insert a module, and paste your macro code.
Assign Macros to Buttons
Make them clickable for non-technical users.
Save as Macro-Enabled Workbook (.xlsm)
Regular .xlsx files can’t store macros.
Pro Tips for Success
Always test macros on backup copies: especially for data-modifying tasks like deleting rows or sending emails.
Use comments in your code: it helps you and your team understand what the macro does at a glance.
Modularize your code: break complex macros into small, reusable procedures for clarity and maintenance.
Final Thoughts
Learning how to use VBA macro examples is one of the smartest moves you can make if you regularly work in Excel. Whether you’re managing data, building reports, or communicating results, Excel automation frees you from the grind of repetitive tasks.
These 10 examples are just the beginning. As your skills grow, you’ll be able to build powerful workflows tailored to your exact needs—and even help your team or organization work more efficiently.
Want to get even more done with macros? Stay tuned for our upcoming guide on advanced VBA automation with userforms, looping logic, and workbook events.
FAQs (Frequently Asked Questions)
Q: What is a VBA macro in Excel?
A: A VBA macro is a small program written in Visual Basic that automates tasks in Excel. Macros can format data, generate reports, send emails, and more—all with a single click.
Q: Are VBA macros safe to use?
A: Yes, as long as you trust the source. Always review the code before enabling macros in a file.
Q: Do I need coding skills to use macros?
A: No. You can use the built-in Macro Recorder or copy/paste ready-made scripts like the ones in this article.
Q: Can VBA be used for data analysis?
A: Absolutely. Macros can generate pivot tables, filter data, and even create charts based on logic.
Q: How do I run a macro in Excel?
A: Go to the Developer tab > Macros > Select the macro > Click “Run”. Or assign it to a button for easier access.
fabanalytics · 2 months ago
Common Mistakes in Excel Financial Reporting and How to Avoid Them
Businesses continue to rely on Excel to maintain and share their financial data. Even so, common errors creep into the analysis of accounting figures. In this article, let us look at some of the most common mistakes to avoid when building your financial reports in Excel.
1. Formula Errors
Formula errors are often hidden but significant. A mistaken cell reference or misused function can distort the financial analysis and undermine the accuracy of reporting. For instance, a single misplaced reference can return entire columns of wrong values.
How to Avoid: Step through each formula and use Excel’s “Trace Precedents” feature to verify its cell dependencies. Always cross-check formulas, especially when copying them to new rows or columns.
2. Inconsistent Data Formatting
Mixed formats for data such as dates or currency can confuse both readers and calculations. Financial reporting, in particular, demands uniform data presentation.
How to Avoid: Keep formatting consistent across the report. To avoid formatting problems, use Excel’s ‘Format as Table’ feature to apply the same formatting to the entire table.
3. Lack of Data Validation
Forgetting to validate data leads to errors and duplications in the reports, making financial models less accurate than they should be; since decisions rest on these figures, wrong numbers can distort business insights.
How to Avoid: Use Excel’s “Data Validation” feature to restrict entries to particular criteria and minimize incorrect inputs. If you check for duplicates and outliers regularly, you will get cleaner data and more reliable financial reporting in Excel.
4. Ignoring Dynamic Ranges
Another common mistake is hard-coding cell ranges in formulas. As the data range expands over time, fixed ranges fail to include new entries, which results in incomplete analysis and skewed insights.
How to Avoid: Use dynamic ranges or Excel tables, which automatically expand as new data is entered. These tools keep your financial reporting in Excel accurate as datasets grow or change.
5. Poor Documentation
Without documentation, it becomes difficult for others to understand how a complicated report is built. This causes confusion in team settings, especially when someone else must update or interpret the figures. Proper labeling and descriptions play an important part in financial reports.
How to Avoid: Annotate inside your sheets: add headers, labels, and comments for key calculations and assumptions. These notes let others follow the structure and reasoning of each section of your financial reporting in Excel.
Conclusion
Improving financial reports is not just about data input; it is also about avoiding these common mistakes. FAB Analytics provides the guidance you need to make your Excel reports precise, coordinated, and trustworthy. With these best practices in place, the financial statements you build in Excel will be accurate and informative, making them a core component of your decision-making.
Source url: https://fabanalytics.blogspot.com/2024/11/common-mistakes-in-excel-financial.html
biglybt · 3 months ago
3.8 Release
There are lots of new features, enhancements and fixes, as you can see from the various beta release posts: https://biglybt.tumblr.com/tagged/BiglyBT3800
Windows users please read https://github.com/BiglySoftware/BiglyBT/wiki/Installation if you have SmartScreen installation issues.
In summary, here is the list of new features:
Added Tag constraint function "isFriendFP()"
Enable filtering in Tags discoveries
Added download/upload rate limits to simple-api plugin
Improve RSS feed entity fudger
Added option to put DND data in different folder
Added timed rotation option to increase minimum seeding time via SP ratio
Added option to not add new download if previously completed
Added torrent version option to share config
Added option to pause downloads during backups
Increase MOC recent location limit
Add search results name tooltip for subscription hits
Support dynamically constructed table cell content
Support multi-column sub-row sort
Added last update, next update and error tooltip to subscription header
Support multi-downloads in peers sub-tab
Double click middle mouse button to pop-out sidebar entry
Added row-details viewer
Added date-added column to my-shares
Added "pause" toolbar icon
Remember FilesView filters
Some people like their files indexed from 1. Visual option only
Filter intermediate Files View nodes when no visible kids
Add search result description to site column tooltip as well
Added URL column to subscriptions; support column filters
Prompt if torrent has suspicious file name extension
Added a networks icons column to library
Added menu item to clear cached tracker peers
Added "actions" subscription column; removed view button
Added options menu to chat overview sidebar
Added edit-http-seeds to sources view
Added clear-peer-cache to sources view
Show availability value in column for uploading pieces
sheetnerds · 8 months ago
Excel Techniques: Mastering Excel with SheetNerds
Microsoft Excel is an incredibly powerful tool that, when used effectively, can help you organize data, perform complex calculations, and automate tasks to save time. At SheetNerds, we’re dedicated to providing expert tips and techniques for mastering Excel. Whether you're new to Excel or a seasoned user, there's always something new to learn. In this guide, we'll cover seven essential Excel techniques that will help you unlock the full potential of this amazing tool. Ready to level up your skills? Let’s dive in.
1. Leveraging Excel Formulas for Efficiency
Excel formulas are the backbone of efficient data analysis. With the right formula, you can quickly manipulate large datasets, perform calculations, and draw insights. The key to success in Excel lies in mastering a few core formulas.
Essential Excel Formulas to Know
SUM(): Adds values together.
AVERAGE(): Calculates the average of a set of numbers.
IF(): A logical function that returns a value based on a condition.
VLOOKUP(): Looks for a value in a vertical table.
INDEX MATCH: A more flexible alternative to VLOOKUP.
The SUM function is one of the simplest yet most powerful tools in Excel. It allows you to add up values from different cells. For example, =SUM(A1:A10) will return the sum of all the values in cells A1 through A10. This can be particularly useful for managing budgets, sales figures, and other numerical data.
Similarly, IF statements allow you to introduce decision-making logic into your sheets. If you need Excel to return a specific value based on a condition, you can write something like =IF(A1>10,"Over 10","10 or less"). This formula tells Excel to check if the value in A1 is greater than 10 and, if so, return "Over 10". Otherwise, it returns "10 or less."
Formula Tips for Faster Workflow
Use keyboard shortcuts: You can insert formulas quickly by pressing Alt + = for an auto sum.
Double-check ranges: Always ensure you're referencing the correct cell ranges to avoid errors in your calculations.
Mix formulas for advanced functionality: Combine formulas like IF and SUM for more complex calculations, e.g., =IF(A1>10,SUM(A2:A5),0).
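The combined example reads as plain conditional logic; here is the same behavior as =IF(A1>10,SUM(A2:A5),0), sketched in Python for illustration:

```python
# Sum the range only when the gate cell exceeds 10, otherwise return 0 —
# mirroring =IF(A1>10,SUM(A2:A5),0).
def conditional_total(a1, a2_to_a5):
    return sum(a2_to_a5) if a1 > 10 else 0
```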
2. Automating Repetitive Tasks with Macros
Excel Macros allow you to automate repetitive tasks, saving you significant time. A macro is a script or a series of instructions that Excel can execute automatically. Whether it’s formatting data, generating reports, or running calculations, macros simplify your workflow.
How to Create a Macro
Navigate to the Developer tab (if not visible, enable it through Excel Options).
Click on "Record Macro."
Perform the tasks you want to automate (Excel will record these actions).
Stop the recording when done.
Your macro is now ready to use.
For example, you could record a macro to format an entire spreadsheet—apply bold headers, align text, and set number formats—in just a few clicks. This technique is especially useful for tasks you perform regularly, such as generating weekly reports.
Best Practices for Macros
Plan your steps carefully: A well-thought-out sequence ensures that your macro runs smoothly.
Test your macro on a sample dataset: This ensures it works as expected before using it on important data.
Document your macros: Add descriptions so that you or your team members can easily understand what the macro does in the future.
Advantages of Using Macros
Increased productivity: Automating repetitive tasks frees up time for more critical work.
Consistency: Macros ensure that repetitive tasks are performed exactly the same way each time.
Scalability: With macros, you can apply actions to large datasets in seconds, regardless of the dataset size.
3. Advanced Data Analysis with Pivot Tables
Pivot tables are one of Excel’s most powerful features, allowing you to quickly summarize large datasets. They enable you to sort, filter, and group data into meaningful reports without needing complex formulas.
Creating a Pivot Table
Select the data range you want to analyze.
Go to the Insert tab and choose "PivotTable."
Choose where you want the PivotTable to appear (new worksheet or existing worksheet).
Drag and drop the fields into Rows, Columns, and Values to structure your report.
With a few simple clicks, you can transform hundreds or thousands of rows of data into a meaningful summary. For example, a sales report showing total sales by region or by product category can be easily created in seconds using a PivotTable.
Customizing Pivot Tables
Filters: Use filters to focus on specific data points, such as sales figures for a particular time period.
Grouping Data: Group data by custom date ranges or other criteria to simplify complex datasets.
Calculated Fields: You can create custom calculations in a Pivot Table that aren’t present in the original data, allowing for deeper insights.
Benefits of Using Pivot Tables
Quick summarization of large datasets.
Interactive reports that can be adjusted on the fly.
Flexible analysis by allowing multiple ways to view the same data.
Here are 10 frequently asked questions (FAQs) about Excel techniques:
1. How can I quickly sum up a column or row in Excel?
You can use the SUM function to quickly add up numbers in a column or row. For example, to sum values in column A, use the formula:
=SUM(A1:A10)
Alternatively, you can use the AutoSum feature by selecting the range and clicking the AutoSum button on the toolbar.
2. What is the fastest way to remove duplicates in Excel?
To remove duplicates:
Select the range of cells.
Go to the Data tab.
Click Remove Duplicates.
Choose the columns from which you want to remove duplicates, and click OK.
3. How can I freeze the top row or the first column in Excel?
To freeze the top row:
Click on the View tab.
Select Freeze Panes.
Choose Freeze Top Row.
To freeze the first column, follow the same steps and choose Freeze First Column.
4. What is conditional formatting and how do I use it?
Conditional formatting allows you to format cells based on specific conditions. To use it:
Select the cells you want to format.
Go to the Home tab and click Conditional Formatting.
Choose a rule, such as highlighting cells greater than a certain value, and apply it.
5. How do I combine text from multiple cells into one in Excel?
You can combine text using the CONCATENATE function or the & operator. For example, to combine text from cells A1 and B1, use:
=A1 & " " & B1
This will combine the text with a space between them.
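The same joining logic looks like this in Python, generalized to any number of values (a small illustrative helper, similar in spirit to Excel's TEXTJOIN):

```python
# Join cell values with a separator, as =A1 & " " & B1 does for two cells.
def concat_cells(*values, sep=" "):
    return sep.join(str(v) for v in values)
```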
6. How do I create a drop-down list in Excel?
To create a drop-down list:
Select the cell where you want the list.
Go to the Data tab.
Click Data Validation.
In the Allow box, select List.
Enter the values you want to appear in the drop-down list, separated by commas.
7. How can I split a cell's content into multiple columns?
To split content into multiple columns:
Select the cell range.
Go to the Data tab.
Click Text to Columns.
Choose the delimiter (such as comma or space) and follow the wizard to separate the data.
8. What is VLOOKUP and how do I use it?
VLOOKUP is a function used to search for a value in the first column of a table and return a value in the same row from another column. The syntax is:
=VLOOKUP(lookup_value, table_array, col_index_num, [range_lookup])
Example:
=VLOOKUP(A1, B2:C10, 2, FALSE)
This searches for the value in A1 in column B and returns the value in the second column (C) of the same row.
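Conceptually, an exact-match VLOOKUP is a scan down the first column; here is a minimal Python model of that behavior (raising an error where Excel would show #N/A):

```python
# Search the first column of a table for lookup_value and return the cell in
# the requested column; col_index_num is 1-based, as in Excel.
def vlookup(lookup_value, table, col_index_num):
    for row in table:
        if row[0] == lookup_value:
            return row[col_index_num - 1]
    raise KeyError(lookup_value)  # Excel would display #N/A here
```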
9. How can I apply a formula to an entire column in Excel?
To apply a formula to an entire column, type the formula in the first cell of the column, then double-click the small square (fill handle) in the bottom-right corner of the cell. Excel will automatically fill the formula down the column.
10. How do I use PivotTables to summarize data in Excel?
To create a PivotTable:
Select the range of data.
Go to the Insert tab.
Click PivotTable.
In the dialog box, choose where to place the PivotTable and click OK.
Drag fields into the Rows, Columns, and Values areas to organize your summary.
These FAQs cover essential techniques to help you work efficiently with Excel.
tccicomputercoaching · 9 months ago
10 most useful Excel Tips
Here are ten useful Excel tips to enhance your productivity and efficiency:
Use Keyboard Shortcuts: Familiarize yourself with common keyboard shortcuts to save time. For example,
Ctrl + C to copy
Ctrl + V to paste
Ctrl + Z to undo
Ctrl + Arrow Keys to navigate to the edges of data regions
Conditional Formatting: Highlight important data or trends by using conditional formatting. You can set rules to change the color of cells based on their values, making it easier to spot patterns.
Pivot Tables: Summarize large datasets quickly with pivot tables. They allow you to group and analyze data without the need for complex formulas.
VLOOKUP and HLOOKUP: Use these functions to search for a value in a table and return a corresponding value from a specified column (VLOOKUP) or row (HLOOKUP).
IF Statements: Create logical tests and return different values based on whether the test is true or false. This is useful for decision-making in your data analysis.
Data Validation: Restrict the type of data or values users can enter into a cell. This helps maintain data integrity and prevents errors.
Text to Columns: Split a single column of data into multiple columns based on a delimiter (like commas or spaces). This is useful for cleaning up data imported from other sources.
Remove Duplicates: Quickly find and remove duplicate values from your data set to ensure data accuracy.
Freeze Panes: Keep the top row or the first column visible while scrolling through your worksheet. This is helpful for large datasets where headers need to remain visible.
Use Formulas Efficiently: Learn and use a variety of formulas to perform calculations, such as SUM, AVERAGE, COUNT, and more complex functions like INDEX and MATCH. Also, using absolute and relative cell references can make your formulas more flexible and reusable.
These tips can help streamline your workflow and make data management in Excel more effective.
TCCI Computer classes provide the best training in all computer courses online and offline through different learning methods/media located in Bopal Ahmedabad and ISCON Ambli Road in Ahmedabad.
For More Information:
Call us @ +91 98256 18292
Visit us @ http://tccicomputercoaching.com/
eskill · 1 year ago
Excel Skills Test: What Basic Excel Skills Employers Should Look for in Candidates?
Microsoft Excel skills are indispensable for various data-centric roles like administration, accounting, and market research analysis. Proficiency levels are typically categorized as basic, intermediate, and advanced. At the basic level, candidates should demonstrate competency in fundamental tasks such as data entry, basic formulas, and formatting.
At the intermediate level, they should be adept at functions like VLOOKUP, pivot tables, and data validation. Advanced skills would encompass complex functions, macros, and advanced data analysis.
An Excel skills test can effectively evaluate these competencies, ensuring candidates possess the necessary proficiency for their roles.
Identifying and Using Basic Tools
Candidates should be able to identify and use the correct tools for specific tasks. This means that your Excel skills test should include tasks like setting and applying filters on column headers to organize table data, renaming and reordering sheets in a workbook, zooming in and out, and sorting columns numerically or alphabetically.
The test should also include merging and formatting cells and tables by alignment; editing font, cell borders, or color; searching and replacing values in a sheet; inserting, deleting, and hiding rows or columns; easily navigating top-level tabs like Home, Data, File, and View; changing page layout and printing spreadsheets; and changing cell formats to text, numerical, currency, or date.
Manipulating Basic Tables
This is an important Excel skill that requires candidates to create or modify a table depending on their given task. This includes manipulating large customer data sets. Hence, the candidates should be able to use tools like sort, filter, and duplicate.
A good candidate will also be able to make tables easy to understand and analyze. So make sure you check their ability to transpose rows to columns, format headers differently from table values, add cell borders and hide sheet gridlines for an improved look, and freeze header rows so they remain visible while scrolling.
Charts
The candidates should be able to turn spreadsheet data into bar or pie charts to suit your company's data visualization policies and help visualize the meaning behind numbers or words.
Data Management
Creating sheets with tools that prevent errors and duplication is part of basic Excel proficiency. Candidates should know how to format sheets to optimize data presentation, along with the relevant keyboard shortcuts. You also need to check whether the candidate can set up worksheets and manage data entry to minimize typos, errors, and duplicates.
Using Basic Excel Functions
Some common Excel functions that basic users should be able to use include:
Sum: This function adds up listed values from a column, table array, row, or manual list. It can be used to calculate total costs or hours worked in a team.
If: This function checks whether a logical argument is true or not.
Average: This function calculates multiple values' average and can help check employees' average monthly income.
Min/Max: This displays the lowest or highest number in a list and can be useful for finding a department’s lowest or highest figures.
Count: This function returns the number of cells containing numbers in a list.
Candidates should be able to connect cells by writing formulas and using cell references without just relying on functions. Excel formulas help identify relationships between values in the cells of a sheet and perform mathematical calculations with them.
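Python's built-ins mirror these basics one-for-one, which makes the behavior easy to see at a glance (the values below are an illustrative column of numeric cells):

```python
# A column of numeric cells, with each Excel basic alongside its equivalent.
values = [4, 8, 15, 16, 23]

total = sum(values)                          # =SUM(...)
average = sum(values) / len(values)          # =AVERAGE(...)
lowest, highest = min(values), max(values)   # =MIN(...), =MAX(...)
count = len(values)                          # =COUNT(...) over numeric cells
over_ten = "Yes" if total > 10 else "No"     # =IF(total>10,"Yes","No")
```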
Get an Unbiased Assessment
The best way to assess Excel skills is by using Excel skills assessment tests. These tests will give you an unbiased ranking of the top candidates. You can request a demo from a top talent assessment platform, which can assess a candidate's experience with various formulas and tools in Microsoft Excel.
makemonewithchatgpt · 1 year ago
100 ChatGPT command prompts for mastering Microsoft Excel
Elevate your Microsoft Excel mastery with the power of ChatGPT command prompts. Discover the best ChatGPT prompts tailored for optimizing your Microsoft Excel experience. Unleash the potential of this dynamic combination to streamline tasks and achieve unparalleled efficiency in your spreadsheet workflows.
Whether you’re a novice or seasoned user, harness the capabilities of ChatGPT to enhance your Excel skills and achieve exceptional results.
1.”Generate a formula to sum the values in column A.”
2.”Create a VLOOKUP formula to retrieve data from another sheet.”
3.”Explain the difference between SUM and SUMIF functions in Excel.”
4.”Generate a bar chart for the data in cells A1 to B10.”
5.”How do I merge cells in Excel and center the text?”
6.”Write a formula to calculate the average of a range of cells.”
7.”Create a macro to automate a repetitive task in Excel.”
8.”How can I protect a specific range of cells with a password?”
9.”Sort data in descending order based on values in column C.”
10."Generate a pivot table summarizing sales data by month."
11."What is the purpose of the IFERROR function in Excel?"
12."How do I freeze panes to keep row and column headers visible?"
13."Create a conditional formatting rule for cells containing errors."
14.”Generate a random sample of data using the RAND function.”
15.”Explain the steps to create a drop-down list in Excel.”
16.”Write a formula to concatenate text in cells A1 and B1.”
17.”How can I find and replace specific text in a worksheet?”
18.”Create a line chart to visualize the trend in sales data.”
19.”What is the purpose of the INDEX and MATCH functions in Excel?”
20.”How do I transpose data from rows to columns in Excel?”
shawnjordison · 1 year ago
Solving the "Table Header Cell Has No Associated Sub Cells" Error in Adobe Acrobat Pro
Learn to solve the 'Table Header Cell Has No Associated Sub Cells' error for table accessibility in Adobe Acrobat. #TableAccessibility, #AdobeAcrobat, #PDFAccessibility, #AccessibilityMatters, #InclusiveDesign, #DocumentCompliance, #ScreenReaderFriendly
Today we’re tackling a common issue in PDF accessibility: the “Table Header Cell Has No Associated Sub Cells” error. This can be a stumbling block when ensuring your documents achieve PDF/UA compliance. Here’s our step-by-step guide to help you resolve this. Video Guide In Adobe Acrobat, tables must have properly associated header and data cells for screen readers to interpret them correctly.…
Tumblr media
View On WordPress
0 notes
willowlark369 · 5 years ago
Text
Design Tips
I’ve taken several classes on spreadsheets as well as some classes on accessible designs. I have also been a part of several competition forums that use spreadsheets (specifically Google Sheets) to keep track of and coordinate information. After much debating, I decided to share some of the tips that I have discovered for making nice spreadsheets that are also accessible.
Check under the Read More line for G-Sheet specific tips.
Basic Formatting Considerations:
Curly fonts are pretty but difficult to read for everyone. Always choose a simple font with minimal serifs and twists for maximum readability. Hate to say it, but the best fonts for readability are typically the default ones anymore (Arial, Calibri, Times New Roman, Helvetica, etc.). Fonts like Comic Sans, Bradley Hand, and Ink Free are also very reader-friendly.
Size 10 is the smallest font you should go and Size 12 is the largest for basic information (not things like headers/labels). Outside this range, fonts start to distort when the magnification is adjusted (either through switching to mobile or for people who up the magnification to prevent eye fatigue).
White backgrounds are evil. Switching to a color that’s even a bit tinted (such as the lightest gray available) reduces eye fatigue. Switching to a different hue entirely is even better.
Speaking of evil colors: neon colors are the devil. Never use them, especially together. Not only are they exhausting to the eye, they are common triggers for things like migraines, seizure disorders, and executive dysfunction. The brain just doesn’t like them.
The basic default color sets make it super simple to choose colors that look nice together without putting a lot of effort into creating colors. Staying in one column or three adjacent ones will give you a theme without effort.
Test your colors at different lighting saturation as well as angles of your screen (if working on a laptop). Just because a combo looks nice at one saturation and/or angle, does not ensure that it is readable at others.
If the font color isn’t black, then it keeps its color when turned into a hyperlink. This avoids the eye-searing blue default link color, which causes major eye fatigue and makes links clash with other colors. It doesn’t even have to be far off of black.
Alternating Colors is excellent for long lists. They help keep track of lines. Be aware that they will automatically continue into other rows & columns if you do not either fill the space with something (coding or such) or set the columns next to the space to a different Alternating Color set. (This function is only available on G-Sheets).
Conditional formatting is best when done one range at a time. It is tempting to do multiple ranges at the same time when they’re going to be formatted the same way, but this increases the likelihood of the function not loading properly and therefore appearing broken when it isn’t.
Coding Basics:
Excel formulae & functions work the exact same in G-Sheets. So anything that talks you through problems with Excel formulae & functions will help you do the same for G-Sheets.
Do not set your number type to a currency unless you are actually needing that currency. A currency may not translate perfectly to regular numbers and it’s just better to be safe than sorry.
You can pull numbers from calculations done on different tabs. You can also do calculations for ranges on different tabs. This allows you to keep a specific tab simple for readability. Basic code for this is 'Tab Name'!Cell Address (e.g. ='Totals'!B2, using a made-up tab name).
Use $ to lock a cell address even through dragging/quick-fill. It goes before the part you are wanting to lock (column or row). If you want to lock both parts, you will need to use it on both parts (e.g. $A$1).
If the same information is being created more than once, it is redundant. Kill the spare.
If the same information is being inputted more than once, you will forget to update all places, regardless of what you tell yourself. Kill the spare.
If it is information that needs updating, it will be far easier to create a table where you just input the information from the next update period and the total is calculated for you.
If the computer can do the calculation for you, that should always be who does it regardless of how easy the math is. It is far easier to check input errors than it is to check computation errors.
If you are needing just an either/or response, consider inserting Checkboxes to the range. They are extremely user friendly and will keep users from giving responses that don’t match each other. Just remember to validate the range so that it’s usable for coding.
Keep the input spaces simple when they are going to be counted or the basis for a conditional format. Users will be creative in their responses if you give them the leeway to do so.
If you are counting an inputted text that is longer than four letters, be prepared for typos. The best insurance against typos is to use a wildcard character to account for them in the counting formulae. Assume that typos will be missed even after multiple rounds of proofreading.
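To make the wildcard idea concrete — in Sheets or Excel this would be something like =COUNTIF(A2:A5, "Gry*") — here is the same counting logic sketched in Python, with made-up responses standing in for user input:

```python
from fnmatch import fnmatch

# Hypothetical user-typed responses; "Gryfindor" and "Gryffindore" are typos.
responses = ["Gryffindor", "Gryfindor", "Slytherin", "Gryffindore"]

# Mirrors a wildcard count like =COUNTIF(A2:A5, "Gry*"):
# the trailing * absorbs whatever typos follow the stable stem.
gryffindor_count = sum(1 for r in responses if fnmatch(r, "Gry*"))
```

Anchoring the pattern on the first few letters means the count stays right even when the tail of the word is misspelled.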
So that’s all I can think of right now. Have fun!
15 notes · View notes
codeavailfan · 5 years ago
Text
SAS vs Excel Which One Is Better For Data Analytics
SAS vs Excel
In this new age of technology, new inventions are changing every day. SAS vs Excel is one of the most common questions for anyone who works with statistics.
Excel is software for organizing and displaying data, and likewise, SAS is statistical software used to analyze and present data.
In this blog, we will look at two of the most important and most common tools for data analysis: SAS vs Excel.
In data analysis, the work does not have to follow one specific tool. The data and operations can live in Excel or in SAS, depending on your needs. Let's look at Excel vs SAS.
Both are used for statistics, and the analysis of statistical data is the central part of statistics. So first of all, let's talk about data analysis.
Definition of Data Analytics?
Data analytics is the study of raw data in order to draw conclusions from that data. Many of the techniques and processes have been automated into mechanical procedures and code that work over raw data for human use.
Data analysis methods can detect patterns and measurements that would otherwise be lost in large amounts of data. You can then use this information in further procedures to improve the company's overall productivity or framework.
Data analytics projects enable organizations to increase revenue, improve operational expertise, improve promotional efforts and support customers, respond more quickly to business model development, and gain an edge over their competitors.
Depending on the specific application, the information you are analyzing may include historical records or ongoing real-time data. It may also mix internal frameworks with external sources of information.
 Definition of SAS?
SAS (Statistical Analysis System) is statistical software used primarily for data management, analytics, and business intelligence. It is written in C, and it runs on most operating systems.
It can be used through its programming language or through a graphical interface. Developed by Anthony James Barr, it can analyze data from spreadsheets and databases. The results can be provided as tables, charts, and reports. SAS is used to view, retrieve, and test real data, and also to run SQL queries.
The purpose of a statistical analysis system is to make use of information from many sources. It is used to collect data from other sources and perform various kinds of statistical tests to obtain real results.
Analysts can use the graphical interface to run statistical tests, but they can also use the SAS programming language directly.
 Definition of Excel?
Excel is a Microsoft program that is part of the Microsoft Office productivity software suite. It was codenamed "Odyssey" during its development at Microsoft and was first released in 1985.
It is ideal for creating and editing spreadsheets stored with .xls or .xlsx file extensions. In general, Excel offers cell-based calculations, pivot tables, and other graphics.
For example, you can use spreadsheets in Excel to create monthly budgets, track operating costs, or sort and compile a lot of information.
Unlike word processors such as Microsoft Word, Excel documents consist of grids of individual cells arranged in rows and columns. Each of these cells may contain text or numerical values that can be calculated using formulas.
Applications of SAS VS EXCEL
Multivariate analysis
Think of someone who wants to buy stocks in bulk. That person has to weigh several factors at once, such as price, quantity, quality, etc. The same applies to statistical analysis when there is more than one variable: multivariate analysis identifies and studies multiple measurable outcome factors at the same time.
It assesses the impact of several explanatory factors on individual outcomes at once, and includes techniques such as factor analysis, analysis of variance, and multiple regression.
Business Intelligence
This refers to the technology and practices applied by a company to analyze business data. It provides insight into the current state of the business.
Reviewing this information supports higher-level decision making. The steps include information retrieval, information extraction, process mining, complex event processing, and comparative analysis.
 Predictive analytics
As the name suggests, it uses current data to make predictions about the future, applying statistical methods to draw conclusions. For example, suppose an organization's sales pattern for article A has been consistent for many years; forecasting for it is straightforward because it does not change. However, article B, whose sales change every month, requires examining all the variables that cause the variation, such as the performance of the content, the customer perspective, and so on. Here, the predictive model searches for patterns found in the historical data to identify likely cases.
Advantages of Excel vs sas
Easy and effective comparisons
Excel is a powerful data analysis tool that lets you explore a large amount of data to detect the trends and patterns that will influence a decision. It allows you to summarize a large amount of data in a simple way.
Work together
By using Excel, you can work on spreadsheets with other users at the same time. The ability to work together enables brainstorming and better ways of working. The main advantage is that when your Excel worksheet is web-based, you can collaborate from anywhere.
Large amounts of data
Recent updates to Microsoft Excel have improved the experience of working with large amounts of data. With powerful filtering, sorting, and search tools, you can quickly narrow down the criteria to help you make decisions.
Microsoft Excel Mobile and iPad apps
With apps for Android and iOS, it's easy to bring spreadsheets to a customer or business meeting without having to carry your laptop. The apps make it easy to work anywhere with your smartphone; you can edit or make changes on your phone or tablet immediately.
SAS is a simple programming language, so anyone can easily learn SAS.
It is a programming language whose code handles large databases well.
Debugging is very simple, and error messages are easy to understand.
Algorithms are first tested and then added or implemented in SAS.
SAS has dedicated customer support
The main advantage is that it provides full security to your data.
Features of SAS VS EXCEL
SAS
Data encryption algorithm
Management
SAS studio
Report output Format
Strong Data Analytics Abilities
Support of Various Types Of Data Format
Excel
Data Filtering
Find and replace command
Data sorting
Automatically edits the results
Password protection
Option for header and footer
Conclusion
With this blog you can more easily choose between SAS and Excel as your software or programming language. You have learned the definitions, advantages, and some features and applications of Excel vs SAS, as provided by our experts, who are always there to help you with every problem. Likewise, make use of our best services and increase your knowledge with us.
As a result, if you want Statistics Assignment Help and Statistics Homework Help or Excel assignment help. Our experts are available to provide you SAS Assignment Help and Do my Statistics Assignment  within a given deadline.
1 note · View note
leftcreatorsheep · 2 years ago
Text
Linear Regression
We want to perform linear regression of the police confidence score against sex, which is a binary categorical variable with two possible values (which we can see are 1= Male and 2= Female if we check the Values cell in the sex row in Variable View). However, before we begin our linear regression, we need to recode the values of Male and Female. Why must we do this?
The codes 1 and 2 are assigned to each gender simply to represent which distinct place each category occupies in the variable sex. However, linear regression assumes that the numerical amounts in all independent, or explanatory, variables are meaningful data points. So, if we were to enter the variable sex into a linear regression model, the coded values of the two gender categories would be interpreted as the numerical values of each category. This would provide us with results that would not make sense, because for example, the sex Female does not have a value of 2.
We can avoid this error in analysis by creating dummy variables.
Dummy Variables
A dummy variable is a variable created to assign numerical value to levels of categorical variables. Each dummy variable represents one category of the explanatory variable and is coded with 1 if the case falls in that category and with 0 if not. For example, in the dummy variable for Female, all cases in which the respondent is female are coded as 1 and all other cases, in which the respondent is Male, are coded as 0. This allows us to enter in the sex values as numerical. (Remember, these numbers are just indicators.)
Because our sex variable only has two categories, turning it into a dummy variable is as simple as recoding the values of Male and Female from 1=Male and 2=Female to 0=Male and 1=Female. (We will see later that creating dummy variables for categorical variables with multiple levels takes just a little more work.) However, it’s good practice to create a new variable altogether when you are creating dummy variables. This way, if you make an error while building the dummy variables, you haven’t altered your original variable and can always start again.
To begin, select Transform and Recode into Different Variables.
Find our variable sex in the variable list on the left and move it to the Numeric Variable -> Output Variable text box.
Next, under the Output Variable header on the left, enter in the name and label for the new sex variable we’re creating. We’ve chosen to call this new variable sex1 and label it Sex Dummy Variable.
Click Change, to move your new output variable into the Numeric Variable -> Output Variable text box in the centre of the dialogue box.
Then, select Old and New Values.
Enter 1 under the Old Value header and 0 under the New Value header. Click Add. You should see 1 -> 0 in the Old -> New text box. Now enter 2 under the Old Value header and 1 under the New Value header. Click Add, and then Continue.
Finally, click OK in the original Recode into Different Variables dialogue box. Scroll down to the very end of the variables list in Variable View. You should see your new dummy variable sex1 at the end of the list, as it’s the last variable to be created.
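For readers who like to see the logic outside of SPSS, the recode steps above can be sketched in code. This is only an illustration — SPSS does this through the dialogue boxes, and the sample values here are made up:

```python
# Hypothetical values standing in for the SPSS 'sex' column (1 = Male, 2 = Female).
sex = [1, 2, 2, 1, 2]

# Mirror of Transform -> Recode into Different Variables, with 1 -> 0 and 2 -> 1,
# producing the dummy variable sex1 (0 = Male, 1 = Female).
recode_map = {1: 0, 2: 1}
sex1 = [recode_map[value] for value in sex]
```

Building the dummy into a new list (rather than overwriting `sex`) mirrors the advice above about keeping the original variable untouched.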
Now, let’s run our first linear regression, exploring the relationship between policeconf1 and sex1.
To perform simple linear regression, select Analyze, Regression, and Linear…
Find policeconf1 in the variable list on the left and move it to the Dependent box at the top of the dialogue box. Find sex1 in the variable list and move it to the Independent(s) box in the centre of the dialogue box. Click OK.
Your output should look like that output tables on the right.
For our purposes, you don’t need to be concerned with most of the results above – you can learn more about these as you become more experienced. However, from some of these, we can work out the effect of sex on confidence in the police. We can use our SPSS results to write out the fitted regression equation for this model and use it to predict values of police confidence for given values of sex.
Using the output from SPSS, we can calculate the mean confidence in the police for men and women using the following regression equation:
Y = a + bX
where Y is equal to our dependent variable and X is equal to our independent variable. Into this equation, we will substitute a and b with the statistics provided in the Coefficients output table, a being the constant coefficient and b being the coefficient associated with sex (our explanatory variable). In this example, our equation should look like this:
policeconf1 = 13.761 + (-0.436 x sex)
Since sex takes on the value of 1 for female and 0 for male, the predicted scores are as follows:
policeconf1 = 13.761 + (-0.436 x 1) = 13.325 (Females)
policeconf1 = 13.761 + (-0.436 x 0) = 13.761 (Males)
So, on average, female respondents reported a police confidence score that is .436 points lower than male respondents. Think about how policeconf1 is measured. In this variable, what does a lower score mean? Also, if you’ve been following all the analyses we’ve done in this section, these scores may look familiar to you. Consider where you might have seen these scores before.
Rather than just accepting these results, we now want to gauge how much of the variation in policeconf1 is explained by sex1. To do this we can simply use the r2 statistic, which you will find is already calculated for you in the Model summary output table above. In this example, the r2 is very low at 0.003. This shows that only 0.3% of the variation in police confidence is explained by sex (0.003 x 100 to give us a percentage).
This suggests that there are many other factors that might be affecting a respondent’s confidence in the police.
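The prediction arithmetic above can be sketched in a few lines of Python, using the coefficients reported in this example's output:

```python
# Coefficients from this example's SPSS Coefficients output table.
constant = 13.761   # a: the constant coefficient
b_sex = -0.436      # b: the coefficient for sex1 (0 = Male, 1 = Female)

def predicted_confidence(sex1: int) -> float:
    """Fitted regression equation: Y = a + b * X."""
    return constant + b_sex * sex1

female_score = round(predicted_confidence(1), 3)  # predicted mean for females
male_score = round(predicted_confidence(0), 3)    # predicted mean for males
```

Because the only predictor is a 0/1 dummy, these two predictions are exactly the group means that Compare Means reports.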
The linear regression model above allowed us to calculate the mean police confidence scores for men and women in our dataset. We can check to see if our calculated mean scores are correct by using the Compare Means function of SPSS (Analyze, Compare Means, Means, with policeconf1 as the Dependent variable and sex as the Independent variable).
What are the results of your mean comparison? They should be exactly the same as the means we calculated above. Calculating the mean scores using simple linear regression, with just one independent variable, was effectively the same function as comparing the means. As we’ll see later, multiple linear regression allows the means of many variables to be considered and compared at the same time, while reporting on the significance of the differences.
Determining the Significance of the Independent Variable
What is the significance of sex as a predictor of police confidence score?
Our sample of data has shown us that, on average, female respondents reported a police confidence score that is .436 points lower than male respondents. We want to know if this is a statistically significant effect in the population from which the sample was taken. To do this, we carry out a hypothesis test to determine whether or not b (the coefficient for females) is different from zero in the population. If the coefficient could be zero, then there is no statistically significant difference between males and females.
SPSS calculates a t statistic and a corresponding p-value for each of the coefficients in the model. These can be seen in the Coefficients output table. A t statistic is a measure of how likely it is that the coefficient is not equal to zero. It is calculated by dividing the coefficient by the standard error. If the standard error is small relative to the coefficient (making the t statistic relatively large), the coefficient is likely to differ from zero in the population.
The p-value is in the column labelled Sig. As in all hypothesis tests, if the p-value is less than 0.05, then the variable is significant at the 5% level. That is, we would have evidence to reject the null and conclude that b is different from zero. In this example, t = -10.417 with a corresponding p-value of 0.000. This means that the chance that the difference between males and females we have calculated is actually due to chance is very small indeed. Therefore, we have evidence to conclude that sex1 is a significant predictor of policeconf1 in the population.
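Since the t statistic is the coefficient divided by its standard error, we can sanity-check the reported numbers by backing out the implied standard error — a small Python sketch using the values from this example's output:

```python
# Reported values from this example's Coefficients table.
b = -0.436   # coefficient for sex1
t = -10.417  # t statistic

# t = b / SE, so the standard error implied by the output is:
standard_error = b / t
```

A standard error this small relative to the coefficient is exactly why the t statistic is large and the p-value tiny.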
Summary
You’ve just used linear regression to study the relationship between our continuous dependent variable policeconf1 and sex, a categorical independent variable with just two categories. Using linear regression, you were able to predict police confidence scores for men and women. What if you wanted to fit a linear regression model using police confidence score and something like ethnicity, a categorical independent variable with more than two categories? The next page will take you through how to run a simple linear regression with a categorical independent variable with several categories.
Note: as we are making changes to a dataset we’ll continue using for the rest of this section, please make sure to save your changes before you close down SPSS. This will save you having to repeat sections you’ve already completed.
Tumblr media
0 notes
jeevanarao · 6 years ago
Text
RULES FILE
Being human, we tend to judge a book by its cover. Soooo I will first give you an overview of what my post consists of, since I don't want you to go wrong.
 What is Rules File?
 Where do we use it?
 Creating Rules File.
 Using Rules File.
 Multiple options available in it.
First, we will try to understand what rules file is and where we use them.
WHAT IS RULES FILE…...??
A rules file is a set of rules/conditions applied to a source data set so that it can be loaded into Essbase (either metadata or data). We can consider this an ETL (http://hyperionatease.blogspot.com) for Hyperion Essbase, as it takes the source data and applies some transformations (conditions/rules) to load it into Essbase.
Rules files can be created by using File system/Data tables (As Source) to Build Metadata and Load Data in Essbase.
Tumblr media
WHEN DO WE USE IT….??
Rules File is mainly used for two things.
Dimension Build.
Data load
We use different rules files for different uses below follows
     Dimension Build: A dimension build rules file is used to build dimensions. In turn, it is used in the automation of dimension builds.
     Data Load: A data load rules file is created to load the data.
CREATING RULES FILES:
1.To create Rules files
    a. In your console, select your Application -> Database -> Rules File (right-click on Rules File) -> Create rules file.
Tumblr media
   b. Or open the File menu -> Editors -> Data Prep Editor (connect to the server and select the application where you want to create it).
Tumblr media
2.Below is the Data prep editor that will be displayed.
3.Next go to files -> Open data file.
4.Select File Location (File System/Essbase Server), select the data file.
Tumblr media
5. What are we building this rules file for??? Is it used for dimension building or data loading??
If you want to build the dimensions then the flat file looks as in Fig 5.a.
Else, for data loading the flat file looks like Fig 5.b.
Tumblr media
                                                  Fig 5.a
Tumblr media
                                                 Fig 5.b
6.Once Selected the Data file, click OK. Data File will be displayed on the screen.
Tumblr media
7. All the data was clumsy, right?? So, to avoid confusion for us as well as the system, we will separate it into different columns.
What should we do to separate them…..?? Ok, let me tell you.
To separate the data into separate field columns, select the Data Source Properties tab (fourth from last in the property bar). A dialog box will appear; in it, select the Delimiter and click OK.
DATA SOURCE PROPERTIES:
This settings are applied on the source data which moulds the file according to the system.
DELIMITER:
This works like any other delimiter that splits a file or data source.
Here we can specify any delimiter using “Custom” option.
FIELD EDITS:
Lists all edits that have been made to this load rule (held in order of changes made).
HEADER:
You can use this option to skip the lines in the data source (usually the header record) or if your data source is set up properly; you can use it as the data load field names or dimension build field name.
IGNORE TOKENS:
Here you can add certain tokens that are to be ignored during the data load.
Tumblr media
Based on the delimiter you used in the data file select the delimiter here we used comma to separate them since, delimiter used in flat file is “Comma”.
Next in the field panel data will be separated by commas and fields are divided into parts.
8. We may sometimes have headers at the top of the data file. If we load the file as-is, we will get an error when loading the field types. In that case we have an option to skip them: DATA SOURCE PROPERTIES -> HEADER -> Number of lines to skip. Enter the number of lines to be skipped during the load.
Tumblr media
RULES FILE FOR DIMENSION BUILDING
9.Now go to Dimension Build Setting, A dialog box will appear in that click on Dimension Build Setting Tab (Since, we are creating the rules file for building the dimensions). Now Select the Dimension, and Select the Build Method as required. And Click OK.
DIMENSION BUILD SETTINGS:
Here you will define which Essbase dimensions you are building with this rules file.
GLOBAL SETTINGS:
Here we can set the default alias table.
We can select the option to use dimension property settings for data configuration or we can select to Auto configure dense/sparse members.
 We have other option to arrange the dimension members in hour glass shape.
 We can define the properties of existing members and attribute members.
BUILD METHODS:
For the given column that you are in, you will first select a dimension, then you will select its association to the outline.
For instance, you can build your dimension via Generation References, Level References or Parent-Child references.
Generations are a top-down approach, where you database name is Generation 0, the dimension names are Generation 1 and so on.
Level References are just the opposite, starting from the leaf level (or lowest possible level) and working up from 0 to the database name. Each has its pros and cons based on your data because you may not always have the same amount of generations or levels, so it might be tricky to build your rules file.
The recommended approach would be to utilize the Parent-child reference, because no matter how many generations or levels you have, you will always have a Parent-Child relationship.
MEMBER SORTING:
The members in the outline can be arranged in the ascending or descending order.
NOTE: This setting cannot be changed once it is selected.
MEMBER UPDATE:
A member can be updated by merging or replacing it .
DIMENSION DEFINITION TAB:
 In this settings we can define a new dimension in the rules file itself.
Tumblr media Tumblr media
10.Next we need to set the FIELD PROPERTIES -> DIMENSION BUILD PROPERTIES.
FIELD PROPERTIES:
Field Properties portion of an Essbase Load Rule, there are three tabs (Global Properties, Data Load Properties, and Dimension Build Properties).
Global Properties affect both Data Load and Dimension Build rules.
Data Load Properties and Dimension Build Properties are specific to each.
GLOBAL PROPERTIES:
 This section deals with whether the data is to be applied in its Original Case, Lowercase or Uppercase.
 Original is the default.
You can add a Prefix (leading text) or a Suffix (ending text).
You can Drop lead/trailing spaces.
Keep in mind, adding a prefix/suffix with impact all members in a given dimension, not just one specific member.
Instead of changing all members within a given dimension with a prefix/suffix, you can specify one member to be modified with the ‘Replace/with’ section.
DATA LOAD PROPERTIES:
In this tab we will define the field for data load.
We can ignore the data field while loading.
 DIMENSION BUILD PROPERTIES:
 This is where you specify whether or not a given dimension is to use Generation References, Level References or Parent/Child references.
 You can also check the box to process null values (this is really applicable when building your dimension using Generation references).
Tumblr media
Select the Dimension for which we are building the Meta data and Field type.
Tumblr media
11.Now Click on Dimension Build Fields. Then you can find that Field name has been changed to your respective field type.
Tumblr media
12.Validate the file, then you can see a window that shows status of success/Failure. 
Tumblr media
13. Then Save the rule file. After saving it you can notice that a new field is created under your database name as Rules Files. Under the Rule files you will find all your rules files.
Tumblr media
Now, our rules file is ready for building the dimension.
RULES FILE FOR DATA LOADING
We have seen how to create a rules file for dimension building. Now we will see to create the rules file for data loading.
First 8 steps in creating a data load rules file is similar to creating rules file for dimension building.
9. After that we need to select the DATA LOAD SETTINGS instead of DIMENSION BUILD SETTINGS.
DATA LOAD SETTINGS:
This settings will reflect on the data getting loaded into the system.
DATA LOAD VALUES:
DATA VALUES:
While loading the data we have multiple options to load:
Overwrite Existing Values
Add Data value to Existing data
Subtracting data from Existing data
Data load with Sign Flip
Dealing with the rejected records.
OVERWRITE EXISTING VALUES:
We use this option for overwriting the values of the existing hierarchy.
ADD DATA VALUE TO EXISTING DATA:
If we enable this option the data we entering will be aggregated to the existing data and finally displays the aggregated value.
SUBTRACTING DATA FROM EXISTING DATA:
If we use this option the data will be subtracted from the existing data and the ultimate result will be the difference between the existing and new values.
 DATA LOAD WITH SIGN FLIP:
We can understand the functionality of this option from its name itself, i.e. flipping the sign of the data from “+” to “–” and vice versa.
GLOBAL SELECT/REJECT BOOLEAN:
Based on the selected option “AND”,”OR” the boolean value of select/reject conditions works during the data load.   
CLEAR DATA COMBINATIONS:
Here we will define the combination in which data is to be cleared while loading.
HEADER DEFINITION:
To reach a specific cell in the block we need to have the combination of all dimension members. So, here we can define the combination which is not specified in the data file.
        Select DATA LOAD VALUES tab in the ribbon of DATA LOAD SETTINGS
Tumblr media
10. Select the FIELD PROPERTIES -> FIELD DEFINITION -> FIELD NAME
      Field name will define the categories of data present in the specified field.
Tumblr media
Similarly we need to define for all the fields by clicking on NEXT.
Tumblr media
If you want to ignore a specific field we have an option “Ignore field during data load”.
Tumblr media
11. While loading a cube it is mandatory to use all the combinations of dimensions to reach a specific cell of the block. So, we have an option to define the combinations that are not defined in our data file, i.e.,
DATA LOAD SETTINGS -> HEADER DEFINITION.
12. Now save the rules file and validate it. You will then see a tab displayed as below.
Next, with the rules file ready, we load the data file to build the dimension members (metadata) in the outline and/or the data in the block.
1. Right-click on the Database -> select the Load Data option.
2. Here we have 3 modes of loading
Load: Used for loading the data.
Build Only: Used for building the metadata.
Both: Used to build the metadata and load the data at the same time.
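The three modes above can be summarized as a matter of what each one touches — the outline (metadata), the data block, or both. A conceptual sketch in plain TypeScript (not an Essbase API):

```typescript
// Conceptual sketch (not Essbase code): what each load mode affects.
type LoadMode = 'Load' | 'Build Only' | 'Both';

function affects(mode: LoadMode): { outline: boolean; data: boolean } {
  switch (mode) {
    case 'Load':       return { outline: false, data: true };  // data only
    case 'Build Only': return { outline: true,  data: false }; // metadata only
    case 'Both':       return { outline: true,  data: true };  // build and load together
  }
}
```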
3. Select the data file and rules file.
  4. Now click on OK.
5. Now you will get a window displaying the status of your Meta data/Data load.
NOTE:
A single rules file can be reused any number of times for loading data or building dimensions.
It is best to create a separate rules file for each dimension.
Rules files have a .rul extension and can be edited with the Essbase Data Prep Editor.
Now we are done with creating the rules file and loading the data.
I was happy that my job was done but I got a sudden surprise.
That is nothing but a WARNINGGGGG…….
Warning:
When everything is correct, why do I get a warning?
I banged my head trying to find the reason. At last I found it.
Do you remember that while creating the rules file we used an option called "IGNORE FIELD DURING DATA LOAD"?
That is the monster here: if we select a field name while using this option, it will throw the warning.
We must not select any field name while using "IGNORE FIELD DURING DATA LOAD".
angking · 3 years ago
Common Mat Table
```css
table { width: 100%; }
.textWrap { word-break: break-word; }
.table-cell-padding { padding: 5px; }
```
html

```html
<table mat-table id="mt" [dataSource]="mtDataSource" matSort
       class="table table-striped table-hover force-style-gcweb-4-0-29"
       [attr.aria-label]="ariaLabel">
  <ng-container *ngFor="let col of columnDefinitionList" [matColumnDef]="col.matColumnDef">
    <th mat-header-cell *matHeaderCellDef mat-sort-header [ngClass]="col.cssClass">
      <span translate>{{col.headerText}}</span>
    </th>
    <td mat-cell *matCellDef="let element" [ngClass]="col.cssClass">{{element[col.matColumnDef]}}</td>
  </ng-container>

  <tr mat-header-row *matHeaderRowDef="columnsToDisplay"></tr>
  <tr tabindex="0" mat-row *matRowDef="let row; columns: columnsToDisplay"
      (click)="onDataRowClick(row)"></tr>

  <tr class="mat-row" *cdkNoDataRow>
    <td class="mat-cell" colspan="9999">{{noRecordsFoundMessage}}</td>
  </tr>
</table>
```
ts

```ts
import { CcColumnDef } from './../../model/common-components/cc-column-def';
import { AfterViewInit, Component, EventEmitter, Input, OnInit, Output, ViewChild } from '@angular/core';
import { MatSort } from '@angular/material/sort';
import { MatTableDataSource } from '@angular/material/table';

@Component({
  selector: 'app-table',
  templateUrl: './table.component.html',
  styleUrls: []
})
export class TableComponent implements OnInit, AfterViewInit {

  mtDataSource = new MatTableDataSource<any>();
  @ViewChild(MatSort) matSort: MatSort;

  private _dataSource: Array<any> = [];
  get dataSource(): Array<any> {
    return this._dataSource;
  }
  @Input() set dataSource(value: Array<any>) {
    this._dataSource = value;
    this.mtDataSource.data = this.dataSource;
  }

  private _columnsToDisplay: string[];
  get columnsToDisplay(): string[] {
    return this._columnsToDisplay;
  }
  @Input() set columnsToDisplay(value: string[]) {
    this._columnsToDisplay = value;
  }

  private _columnDefinitionList: Array<CcColumnDef>;
  get columnDefinitionList(): Array<CcColumnDef> {
    return this._columnDefinitionList;
  }
  @Input() set columnDefinitionList(value: Array<CcColumnDef>) {
    this._columnDefinitionList = value;
  }

  @Input() ariaLabel: string;
  @Input() noRecordsFoundMessage: string;

  @Output() tableRowClick: EventEmitter<any> = new EventEmitter<any>();

  constructor() { }

  ngAfterViewInit(): void {
    // Attach sorting once the MatSort directive exists in the view.
    this.mtDataSource.sort = this.matSort;
  }

  ngOnInit(): void { }

  onDataRowClick(data: Event) {
    this.tableRowClick.next(data);
  }
}
```
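A parent component feeds this table through its `@Input()`s. A minimal sketch of that wiring — the column names here are invented for illustration, and `CcColumnDef` is redeclared so the snippet stands alone:

```typescript
// Hypothetical column setup for the reusable table component above.
// matColumnDef must match a key on each data row; columnName feeds
// columnsToDisplay, whose order controls the rendered column order.
interface CcColumnDef {
  cssClass: string;
  matColumnDef: string;
  headerText?: string;
  columnName: string;
}

const columnDefinitionList: CcColumnDef[] = [
  { cssClass: 'widthFlex1', matColumnDef: 'requestNumber', headerText: 'table.requestNumber', columnName: 'requestNumber' },
  { cssClass: 'widthFlex1', matColumnDef: 'service', headerText: 'table.service', columnName: 'service' },
];

// columnsToDisplay is ['requestNumber', 'service']
const columnsToDisplay = columnDefinitionList.map(c => c.columnName);
console.log(columnsToDisplay);
```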
Client
```html
<div
  *ngIf="caseDetail && caseDetail.phase && participantSessionStorage && filteredCaseTypes && filteredCaseTypes.length > 0"
  class="container">

  <h1 tabindex="0">
    {{ participantSessionStorage?.participantName }} - {{ participantSessionStorage?.displayedCaseId }}
  </h1>

  <div class="panel panel-default force-style-gcweb-4-0-29">
    <header class="panel-heading">
      <h6 class="panel-title">
        <div *ngIf="caseDetail">
          <span tabindex="0">{{ "phase" | translate }}: {{ caseDetail.phase | titlecase }}</span>
          <span tabindex="0" *ngIf="caseDetail.rss" class="pull-right">
            {{ "assignedRss" | translate }}: {{ caseDetail.rss.firstName }} {{ caseDetail.rss.lastName }}
          </span>
        </div>
      </h6>
    </header>

    <div class="panel-body">
      <div class="wb-frmvld mrgn-tp-md">
        <form-error [validateForm]="casePageForm" [appFormErrors]="appFormErrors"></form-error>

        <form #casePageForm="ngForm" class="form-inline">
          <div class="form-group input-group col-md-9 mrgn-bttm-md">
            <label tabindex="0" for="caseType" class="required">
              <span class="field-name">{{ "caseType" | translate }}</span>
              <strong class="required">({{ "required" | translate }})</strong>
            </label>
            <select ngModel #caseType="ngModel" class="form-control" id="caseType" name="caseType"
                    autocomplete="honorific-prefix" required="required"
                    (change)="onCaseTypeSelection($event)">
              <option value="">Select</option>
              <option class="case-options" *ngFor="let caseType of filteredCaseTypes"
                      [value]="getCaseType(caseType)"
                      [selected]="selectedCase.displayText === caseType.displayText">
                {{ caseType.displayText }}
              </option>
            </select>
          </div>

          <div class="table-responsive table table-striped table-hover force-style-gcweb-4-0-29">
            <ng-container *ngIf="selectedCase?.value !== ''">
              <app-simple-table
                [columnDefinitionList]="columnDefinitions"
                [columnsToDisplay]="displayedColumns"
                [dataSource]="mappedDS"
                [ariaLabel]="ariaLabelInfo"
                [noRecordsFoundMessage]="noRecordsFoundMessage"
                (tableRowClick)="onTableRowClick($event)">
              </app-simple-table>
            </ng-container>

            <!-- Need to remove this if condition once the pagination is done for all case types (BE) -->
            <app-pagination
              *ngIf="selectedCase?.value === 'caseNotes' || selectedCase?.value === 'assessments' || selectedCase?.value === 'serviceRequests'"
              [index]="page"
              [data]="mappedDS"
              [pageSize]="pageSize"
              [totalCount]="totalCount"
              [rulerSize]="ruleSize"
              (pageChange)="paginate($event)">
            </app-pagination>
          </div>

          <div *ngIf="selectedCase?.value === CASE_NOTES" class="input-group col-md-12 caseNotesTextSection">
            <label for="caseNotesText" class="required">
              <span class="field-name">{{ "addNote" | translate }}</span>
              <strong class="required">(required)</strong>
            </label>
            <textarea title="{{ 'addCaseNoteHere' | translate }}" id="caseNotesText" name="caseNotesText"
                      (input)="onCaseNotesInput($event)" #caseNotesText="ngModel" [(ngModel)]="caseNotes"
                      required [maxlength]="maxCaseNoteLength + 1"
                      placeholder="{{ 'addCaseNoteHere' | translate }}"
                      class="form-control case-note-text-area"
                      [class]="{ 'case-note-text-area-error': !caseNotesText.valid && caseNotesText.dirty }">
            </textarea>
          </div>
          <div>
            <span>{{ caseNotes ? caseNotes.length + "/" + maxCaseNoteLength : "" }}</span>
          </div>
        </form>
      </div>
    </div>

    <footer *ngIf="selectedCase?.value === CASE_NOTES" class="panel-footer">
      <a href="#" (click)="openAddNoteConfirmation($event)" class="wb-lbx btn btn-primary">
        {{ "addNote" | translate }}
      </a>
    </footer>
  </div>
</div>
```
ts

```ts
import { Sort } from '@angular/material/sort';
import { AppFormError } from './../../common-components/app-form-error/model/app-form-error';
import { ModalDialogComponent, ModalActionButton, ModalActionTypeEnum, ModalAlertCssEnum } from 'src/app/common-components';
import { ActivatedRoute, Router } from '@angular/router';
import { Component, OnDestroy, OnInit, ViewChild } from '@angular/core';
// Note: Do not change the order of the contents of JSON file "CaseTypesJson" below
import CaseTypesJson from '../../../assets/dropdowns/case-types.json';
import { MatDialog } from '@angular/material/dialog';
import { ModalDialogConfig } from 'src/app/common-components/modal-dialog/modal-dialog-config';
import { Subject, takeUntil } from 'rxjs';
import { NgForm, FormGroup } from '@angular/forms';
import { CaseTypesRequest, CaseType, CcColumnDef } from 'src/app/model';

@Component({
  selector: 'app-case',
  templateUrl: './case.component.html',
  styleUrls: ['./case.component.css'],
})
export class CaseComponent implements OnInit, OnDestroy {
  noRecordsFoundMessage = '';
  ariaLabelInfo: string;
  CASE_NOTES = CaseTypesJson[11].value; // Case notes dropdown value
  maxCaseNoteLength = AppSettings.CASE_NOTE_MAX_LENGTH;
  caseNotes = '';
  caseDetail: CaseDetail;

  // breadcrumbs
  displayCaseId: string = '';
  participantSessionStorage: ParticipantInfoSessionStorage;

  // Data for "Drop down"
  originalCaseTypes: Array<CaseType> = CaseTypesJson;
  filteredCaseTypes: Array<CaseType> = [];
  selectedCase: CaseType = ModelUtil.getDefaultModelForCaseType();
  columnDefinitions: Array<CcColumnDef> = [];

  currentSort: Sort;
  pageSize: number = AppSettings.DEFAULT_PAGE_SIZE;
  page = 1;
  pageIndex: number = 1;
  ruleSize: number = AppSettings.RULE_SIZE;
  totalCount: number = 0;
  paginationDetails = {
    PageNumber: AppSettings.DEFAULT_PAGE_NUMBER,
    PageSize: AppSettings.DEFAULT_PAGE_SIZE,
  };

  //#region "Object Properties"
  displayedColumns: string[];
  apiOriginalData: Array<any>;
  mappedDS: Array<any> = [];
  //#endregion "Object Properties"

  //#region App form errors
  get appFormErrors(): AppFormError[] {
    return this.getErrorMessages();
  }
  //#endregion

  private readonly _destroying$ = new Subject<void>();

  @ViewChild('caseNotesText') caseNotesText: NgForm;
  @ViewChild('casePageForm') casePageForm: FormGroup;

  private roles: string[] = [];

  constructor(
    private router: Router,
    private route: ActivatedRoute,
  ) { }

  ngOnInit(): void {
    this.subscribeToLanguageChange();
    this.setUpCaseTypes();

    // get participant session storage
    this.participantSessionStorage = JSON.parse(
      this.sessionStorageSrv.getItem(
        SessionStorageKeys.PARTICIPANT_INFO_SESSION_STORAGE
      )
    );
    if (this.participantSessionStorage) {
      // setting a value for the breadcrumbs label
      this.displayCaseId = this.participantSessionStorage.displayedCaseId;
      this.breadcrumbService.set('@displayCaseId', this.displayCaseId);

      // get case detail
      this.caseService
        .getCaseDetail(this.participantSessionStorage.caseId)
        .subscribe({
          next: (res) => {
            if (res) {
              this.caseDetail = res;
            }
          },
          error: (error: Error) => {
            // TODO: Error Handling
            console.log('test', error);
          },
        });
    }
  }

  ngOnDestroy(): void {
    this._destroying$.next(undefined);
    this._destroying$.complete();
  }

  onCaseTypeSelection(event: Event) {
    let selectedStringValue = (event.target as HTMLSelectElement).value;
    if (selectedStringValue === '') {
      this.resetCaseTypeControl();
    } else {
      this.selectedCase = JSON.parse(selectedStringValue);
      this.caseNotes = '';
      this.loadDataWithSelectedCaseType();
    }
  }

  getCaseType(type: CaseType) {
    return JSON.stringify(type);
  }

  loadDataWithSelectedCaseType() {
    switch (this.selectedCase.value) {
      // "value": "claims"
      case CaseTypesJson[0].value: {
        this.setupModelsForClaims();
        break;
      }
      // "value": "assessments"
      case CaseTypesJson[2].value: {
        this.setupModelsForAssessments();
        break;
      }
      // "value": "rehabilitationPlans"
      case CaseTypesJson[3].value: {
        this.setupModelsForRehabilitationPlan();
        break;
      }
      // "value": "vocationalTrainingPlans"
      case CaseTypesJson[4].value: {
        this.setupModelsForVocationalTrainingPlan();
        break;
      }
      // "value": "progressUpdates"
      case CaseTypesJson[5].value: {
        this.setupModelsForPrgUpdates();
        break;
      }
      // "value": "serviceRequests"
      case CaseTypesJson[7].value: {
        this.setupModelsForSrvRequests();
        break;
      }
      // "value": "closureRequests"
      case CaseTypesJson[8].value: {
        this.setupModelsForClosureRequests();
        break;
      }
      // "value": "authorizationTravelRequests"
      case CaseTypesJson[10].value: {
        this.setupModelsForAuthTrvRequests();
        break;
      }
      // "value": "caseNotes"
      case CaseTypesJson[11].value: {
        this.setupModelsForCaseNotes();
        break;
      }
      // "value": "participantExperienceFeedback"
      case CaseTypesJson[12].value: {
        this.setupModelsForPartExpFeedback();
        break;
      }
      // "value": "referralToRSP"
      case CaseTypesJson[13].value: {
        this.setupModelsForReferralToRSP();
        break;
      }
      // TODO: This would need some thought - when "Select" is selected
      default: {
        // this.resetCaseTypeControl();
        break;
      }
    }
  }

  resetCaseTypeControl() {
    this.mappedDS = [];
    this.ariaLabelInfo = '';
    this.noRecordsFoundMessage = '';
    this.displayedColumns = [];
    this.selectedCase = ModelUtil.getDefaultModelForCaseType();
    this.caseNotes = '';
  }

  setNoRecordsFoundMessage(mappedDS: Array<any>) {
    if (mappedDS && mappedDS.length === 0) {
      this.noRecordsFoundMessage = this.translateFilter.transform('casePage.noDataRow');
    }
  }

  setAriaPropertyForCases(displayText: string) {
    if (displayText) {
      this.ariaLabelInfo = displayText;
    }
  }

  setupModelsForAssessments() {
    this.columnDefinitions = CaseTypeAssessmentUtil.getColumnDefinitionListForAssessments();
    if (this.columnDefinitions) {
      this.displayedColumns = this.columnDefinitions.map((c) => c.columnName);
    }
    if (this.displayedColumns && this.displayedColumns.length > 0) {
      this.mappedDS = [];
      // TODO: need to remove hardcoded case id, once we are ready to point to the real CMS endpoint for assessments
      this.caseService
        .getCaseDetailByType(
          '8709a1d6-d3a6-41b7-615b-08da433495bd',
          'assessment',
          this.paginationDetails
        )
        .subscribe({
          next: (res: any) => {
            if (res && res.data && res.data.length > 0) {
              this.mappedDS = CaseTypeAssessmentUtil.transformDSForAssessment(res.data);
              this.pageSize = res.pageSize;
              this.totalCount = res.totalRecords;
              this.paginate(this.pageIndex);
            }
            this.setNoRecordsFoundMessage(this.mappedDS);
            this.setAriaPropertyForCases(CaseTypesJson[2].displayText);
          },
          error: (error: Error) => {
            // TODO: Error Handling
            console.log('test', error);
          },
        });
    }
  }
}
```
util

```ts
import { DateTimeUtil } from 'src/app/utils/datetime-util';
import { CcColumnDef } from 'src/app/model';

export class CaseTypeAssessmentUtil {
  // TODO: need to remove this once BE integration is done
  static getData() {
    return [
      {
        requestNumber: 'RQ-001',
        service: 'IVA',
        referralDate: DateTimeUtil.isoToYYYY_MM_DD(new Date()),
        reports: 'TBD',
      },
      {
        requestNumber: 'RQ-002',
        service: 'IVA',
        referralDate: DateTimeUtil.isoToYYYY_MM_DD(new Date()),
        reports: 'TBD',
      },
    ];
  }

  static getColumnDefinitionListForAssessments(): Array<CcColumnDef> {
    return [
      { cssClass: 'widthFlex1', matColumnDef: 'requestNumber', headerText: 'caseTypeAssessmentTable.requestNumber', columnName: 'requestNumber' },
      { cssClass: 'widthFlex1', matColumnDef: 'service', headerText: 'caseTypeAssessmentTable.service', columnName: 'service' },
      { cssClass: 'widthFlex1', matColumnDef: 'referralDate', headerText: 'caseTypeAssessmentTable.referralDate', columnName: 'referralDate' },
      { cssClass: 'widthFlex1', matColumnDef: 'reports', headerText: 'caseTypeAssessmentTable.reports', columnName: 'reports' },
    ];
  }

  static transformDSForAssessment(apiOriginalData: Array<any>): Array<any> {
    let dataArray: Array<any> = [];
    if (apiOriginalData && apiOriginalData.length > 0) {
      dataArray = apiOriginalData.map((row) => {
        return {
          requestNumber: row.requestNumber,
          service: row.service,
          referralDate: DateTimeUtil.isoToYYYY_MM_DD(new Date(row.referralDate)),
          reports: row.reports,
          assessmentId: row.assessmentId,
        };
      });
    }
    return dataArray;
  }
}
```
int

```ts
export interface CcColumnDef {
  cssClass: string;
  matColumnDef: string;
  headerText?: string;
  columnName: string;
}

export interface CaseType {
  displayText: string;
  value: string;
  enableOption?: boolean;
}
```
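The `CaseType` object survives the `<select>` round trip as a JSON string: `getCaseType` serializes it into the option value, and `onCaseTypeSelection` parses it back in the change handler. A standalone sketch of that round trip (the interface is redeclared so the snippet runs on its own):

```typescript
interface CaseType {
  displayText: string;
  value: string;
  enableOption?: boolean;
}

// Serialize for the <option [value]> binding, as getCaseType does...
const selected: CaseType = { displayText: 'Assessments', value: 'assessments' };
const optionValue = JSON.stringify(selected);

// ...and parse it back in the change handler, as onCaseTypeSelection does.
const roundTripped: CaseType = JSON.parse(optionValue);
console.log(roundTripped.value); // assessments
```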