I do, Madam Chair. In fact, I've got a very brief presentation that I'll project here. There are eight slides that will basically provide an overview and a foundational understanding of the policy on results and of departmental results frameworks. We'll take questions on that and then turn to a demo on InfoBase.
I'm also joined at the table by my colleagues from the expenditure management sector, Chantal Clow and Andrew Gibson. They'll be aiding in the presentation today.
If I may, in the presentation in front of you, slide 2 simply provides some context in terms of the government's results agenda. There are really three pieces that serve as important background here. First is changes at the cabinet level: the creation by the Prime Minister of a new cabinet committee, known as the agenda, results, and communications committee. This brings a focus to results discussions that is a key part of the results agenda going forward.
Second, we have some machinery changes. In late 2015 or early 2016, the Prime Minister announced the creation of a new central unit in the Privy Council Office, known as the results and delivery unit, to provide central oversight of the results agenda and to support his cabinet committee.
Third, we have a new Treasury Board policy that came into effect on July 1, 2016. The policy on results provides a framework for a whole-of-government view of reporting and results, including articulating core responsibilities and key results expectations, identifying a program inventory that departments report on in part III of the estimates, and then making adjustments to ongoing performance measurement and evaluation of programs.
So that is a brief overview of what is behind our new policy. I will now highlight the key elements.
Slide 3 presents in visual form the key elements of the policy. A starting point is the articulation by departments of what we call departmental results frameworks. These are developed by departments, and they are presented by ministers to Treasury Board ministers. The responsible minister will come to the Treasury Board table identifying his or her departmental results framework, which includes the core responsibilities of the department, the key results that they expect to achieve in support of those core responsibilities, and then an explanation of the performance indicators that departments will assess throughout the year to determine whether they're on track.
The program inventories are the mass of programs within a department that support the core responsibilities. This can be anything from, using CRA as an example, a program aimed at audit of large tax filers. In our case, at the Treasury Board Secretariat, we have a program related to our responsibilities as the employer, so there's a program related to collective bargaining.
The performance information profiles are the information holdings of the department: the key metrics of the program in terms of the budget and the number of FTEs—again, some very important performance information—and then plans in terms of how performance will be assessed over the year.
Finally, a key part of the policy was a renewed focus on evaluation. Prior to 2016, there was a requirement for all government departments to evaluate all programming on a five-year basis—big or small, every program had to be evaluated every five years. We found this, in consultation with both departments and our colleagues in other central agencies, to be out of sync with the real needs of both departments and central agencies.
As an example, in a world in which it was mandatory to evaluate a program, the cycle would unfold and we would be in the fifth year, and so a department would have to evaluate program x or y. In some cases it could be a long-standing program that had been evaluated several times in the past, that was of relatively low dollar value, with no changes foreseen. A mandatory evaluation of that program might have displaced something that might have been more critical for the department or the government in terms of future decision points—whether a program was sunsetting, whether it was being considered for significant adjustments as a result of requirements from stakeholders or of discussions with provinces, etc.
The renewed policy on evaluation introduces more flexibility for departments to adopt a risk-based and needs-based approach to evaluation so that they can have pertinent information at the time of decisions rather than at a set time dictated by a mandatory requirement.
Slide 4 explains a couple of key pillars of this policy, with particular emphasis on governance structures and people. Key in our formulation of the governance of the policy was the designation of certain officials within departments for responsibility in the creation of new monitoring and decision-making bodies in departments.
For example, there is the centralized performance measurement and evaluation committee. It's a new committee.
It didn't exist before the policy. It's led by a deputy minister or associate and has the responsibility to monitor the performance and evaluation work of the department and make sure that such information is considered as part of the ongoing management and resource allocation decisions of the department.
The policy also designated within each department a lead for performance measurement and evaluation. In some cases, departments have chosen to make those positions one and the same; in others, they have a separate evaluation function and a separate head of performance measurement.
The policy also made clear that it's not the head of performance measurement or some other official in the department who is responsible for a specific program. It is the functional lead of that program who is responsible for performance measurement and achieving the objectives.
As a final point to note on governance, Treasury Board approves programs through Treasury Board submissions. In the past these have contained bits and pieces of results information that justify the program expenditure and set out what the organization is trying to do. As a result of the policy, we made this a very clear requirement moving forward. There is a results annex in each and every Treasury Board submission that clearly lays out objectives over the short, medium, and long term, with measures of success and indicators, and the deputy head of the organization signs off on that results annex.
Turning to slide 5, here are some advantages of this new approach. It's a more systematic way of presenting information and of more explicitly tying resource allocation to the results that departments are trying to achieve.
Slide 6 presents this in visual form. You have your core responsibility, your results expectations and indicators, and then the program underneath.
Slide 7 is a visual representation of a very real example of this. This is for the Canada Revenue Agency, which has identified core responsibilities related to tax, benefits, and its ombudsman role. That layer across the top is your core responsibilities, then underneath, you have your results and expectations, and then the performance indicators.
Madam Chair, that was a brief overview of our policy.
After your questions, we will turn things over to Andrew Gibson, who will demonstrate the Government of Canada's InfoBase system, or GC InfoBase.
Thank you, Madam Mendès.
Mr. McCauley and Mr. Blaikie, there's been much discussion and no shortage of confusion over the government's efforts to align the estimates with the budget and over the use of TB vote 40, the budget implementation vote, to help achieve that.
I do take the point that the items in budget 2018 and represented in TB vote 40, the budget implementation vote, are not included in departmental plans. That's a truism.
In the same vein, in years past where we've had the main estimates, then the budget, and then a series of supplementary estimates (A), (B), and (C), Parliament has voted in supplementary estimates (A), (B), and (C) for new spending and, in some cases, new programs that were not foreseen in the main estimates and were not part of the departmental plan that was tabled by the department on or before March 31.
If the question is “how do we know what we're voting on because it's not in the departmental plan?”, I would simply point out that this has always been a weakness of our system. It's a weakness that—
Mr. Daniel Blaikie: [Inaudible—Editor] and the voting sucks.
Mr. Brian Pagan: —we're trying to address.
I just want to get to the policy on results. There are a couple of items. In 3.1 it has the expected results.... Actually, I'll read 3.1.2: “Enhance the understanding of the results government seeks to achieve, does achieve, and the resources used to achieve them.” Great. Again, though, this goes back to the fact that the $7 billion is not in the departmental plans. When we've asked the departments, they're not even sure what the money's going to be used for. They're very clear they don't know what the intended results are going to be for the hundreds.... We'll use PSPC as an example. For two-thirds of $1 billion, they weren't even sure what the plan was going to be. They hadn't even developed it, but they're asking for money in advance. Yet it says, “enhance the understanding”.
Under 3.2.1 it says, “Departments are clear on what they're trying to achieve and how they assess success”. Here we have the policy on results that says departments should be clear on what they're trying to achieve and how they assess success. However, when we actually had departments in front of us, including ministers, they were unsure, saying they hadn't formulated the plans, but to give them the money up front.
This goes back to your previous comment about how money not spent will show up in the winter supplementaries. We usually had the opportunity, in the supplementaries, to actually question departments and question ministers on that. The vote 40 issue takes that away entirely.
Again, these seem to be contradictory things. We have vote 40, which is supposed to mean better transparency for us—and, yes, I understand what we're trying to do: get the money out the door faster. But you have your own policy on results stating that departments should be clear on what they're trying to achieve and how they assess success. They sat before us in this committee and said, “Well, we don't know what the success is. We don't know what the money's going to be used for.” One of the departments said they didn't even know how they came up with the budget; they were just told to put the number in and to ask Treasury Board.
You know, I understand what you're trying to get at, and we appreciate that, but it seems that we have the cart before the horse, or we have the policies before the horse. It's almost as if we have these policies that don't reflect the reality.
Good afternoon, everyone.
I'm here to introduce GC InfoBase. It is an online tool that presents data on finances, people management, and now government performance.
I will begin by presenting the tool, the principles it is based on, and recent changes relating to performance.
Four principles guide the evolution of the InfoBase.
First, we want it to be a reliable source. All figures presented can be found in a published source, such as the Public Accounts of Canada or the estimates.
Second, we present the information in a nuanced and comprehensive manner. There are several high-level presentations of the data so that the user can understand them as a whole, and there are also detailed presentations for people who want to dig deeper.
Third, it is critical that GC InfoBase does not interpret or give an opinion; the data speaks for itself. Government organizations may submit background information through footnotes, but the data are presented in the same manner.
Finally, it is a tool that continuously adapts to users' needs. We are always looking for new techniques to communicate and visualize data.
Those are the four guiding principles.
What you have up on the screen in front of you is the main page. I'm going to walk through the different sections of the tool.
The main principle behind the InfoBase is that we seek to “re-present” information over and over again, in different ways. When someone cracks open the public accounts, it might be something really relevant to them, or their eyes might glaze over because there are a ton of tables and a ton of numbers and it doesn't tell them a story.
What we're seeking to do is to say that some people like tables and numbers—that's great—some people like to have the information described to them with words—you'll see that—and some people like to have pictures drawn for them—we're also doing that in the tool.
The finance section begins with a government-wide overview. Then we will drill down to a department and even to a program, just to see how much data is available. Then there is the data on people; again, it starts at the government level, and then we pick a department. Then there is the government's performance data. Once again, we will choose a department just to see the data entered at this level.
After that, we're going to look at some featured content.
I noticed a moderate amount of interest at the table around vote 40 and where that information is going. We're going to show you what we've done to drive more transparency around that, and I'm going to be talking about what more we're going to be doing. I'll talk about my favourite giant picture, which kind of lays out the government so that people can understand what information is available.
We're going to start with finances.
I should also say, there's a section here where, if someone just wants to search for a word, a department, or a program name, they can jump straight to it. We are really trying to be inspired by all of the work that's done out there—by Google and the like—to make products as easy as possible for people to use.
There's always this weird tension. A while ago it used to be that you would come to work to use the Internet, and now, you go home to use the Internet. We're trying to make sure that the kinds of tools you get to use in your personal life are also available to you at work.
Going to the finance section, this is at the level of the Government of Canada. I'm not going to go over everything because I will bore you all to tears with that level of detail. If anyone wants me to stop, I will, and we can focus on a particular thing.
I'm sure some of you have noticed that the government likes to use a lot of complicated, technical terms when it talks about how it spends money. For those who aren't familiar with all the definitions, we have this hidden section here, where we lay out all of the definitions. We don't want to stop people from actually learning something, so we don't slap people in the face by making them read that right away, but it's there if they need it.
What you have here with these blue squares is what we call our welcome mat. It doesn't look like much, but it actually combines hundreds of reports into one kind of Government of Canada view. What you can easily see, without having to open any books or do any homework, is how much spending there was five years ago. This is a number you'll see in the public accounts. You'll also see how many FTEs were employed in the Government of Canada last year, and what we are planning to spend in fiscal year 2019-20.
Going down, what you see here is a split of spending and FTEs by Government of Canada spending areas. You have economic affairs, social affairs, government affairs, crown corporations, international affairs, and internal services. Right away you can learn some interesting stuff. You see, for example, that while economic affairs is the highest spender, social affairs has a large number of FTEs. If you're particularly interested, you can just click on one of these bars—I clicked on social affairs—and you see a split by the particular Government of Canada outcome area, so you can see that breakdown.
Also, I should have said that, at the bottom of every page, or the bottom of every one of these little panels, you see a link where you can actually go and see the raw data behind all these graphs.
Moving down, you see authorities and expenditures. This is basically what was approved by Parliament and what was actually spent, as per the public accounts for the past five years. You will see the current voted and statutory split as of the main estimates that have been tabled for 2018-19. It just tries to lay out graphically the largest areas of statutory spending for the Government of Canada, so someone can see that old age security and the Canada health transfer are by far the two largest bubbles in statutory spending.
Moving down, you see the details on voted estimates. You see—no surprise—that national defence is the largest recipient of voted authorities, followed by Indigenous Services Canada and Treasury Board with all of our fancy central votes.
Moving along, you have a five-year history of transfer payments. That is the largest particular area of government spending. I'll go over that again when we get to one of the other visualizations. You can see a breakdown between grants, contributions, and other transfer payments. Then you can see a five-year history of personnel expenditures for the Government of Canada.
That was one view, and what I said in the intro was that we were going to then dive down and pick a particular department. We're going to do Parks Canada.
You scroll up to the top. You start typing in “Parks Canada”. It auto-suggests it for you. I'm not going to go over it in the same level of detail. I'll just touch on where things are a bit different.
You have the same welcome mat, but now it's focused just on that particular department. You get to see the relative size of Parks Canada compared to the rest of government. You can see that same five-year history of authorities and expenditures. Here is something where you can look at a history of the standard objects of expenditure—for example, if you want to see what's been going on with personnel spending. You might also want to turn on transfer payments. I can turn on other parts of this so you get a dynamic way of looking at where Parks Canada is spending its money. Going down, you can see where Parks Canada stands in terms of transfer payments. I'm just going to skip ahead, though, to the more interesting thing.
This is a transparency initiative that we started last year, and it's information that is actually not in any published document, and it provides a whole new level of transparency at the program level. What you see here with that horizontal bar chart is that each bar represents a program and it's split into the types of spending for that particular program. If we want, for example, to just focus in and see what the personnel spending is for all programs, there it is.
If you want to drill in and see professional and special services spending, you can get that. This is not information that is published anywhere else, but it does provide a lot more insight and it lets you understand at a much greater level of depth what a particular program is doing.
What you can do from here is say that visitor experience is interesting; it's a program you might want to know more about, so you can click on that and you have yet again another infographic at the—
At the level of a program, you have the same welcome mat; you can see what's going on with this particular program. Going on, you have authorities and expenditures, and you have a nice pie chart that says, this is the kind of program this is: it's a program that spends most of its money on personnel; on acquisition of land, buildings, and works, which makes sense for Parks Canada; on professional and special services; and then on all the other spending objects. It's just an extra level of detail so that someone can understand what exactly is going on in this program.
Jumping up to the top, we're going to jump back to the level of the Government of Canada and are going to look at people management data.
This is something nice, because we added a bunch of employment equity data recently so that someone can get an even more detailed sense of what's going on with a particular department.
At the level of the Government of Canada, you can see a five-year history of the head count for the federal public service; you can see a nice dynamic map that shows all federal employees; if you mouse over a particular province, such as British Columbia, you can get a five-year history of the head count within that province.
I'm going to jump back up. Let's pick Canadian Heritage. Here you can start seeing some interesting stuff. Again, you can see Heritage in terms of head count compared with the rest of the government—a five-year-history head count for Heritage. You can see where Heritage employees are distributed across the country: the majority are in the NCR; there are 39 in British Columbia. That head count hasn't changed too much over the past five years.
What's interesting here is that now you can look at demographic trends for Heritage Canada. If you want to see what has been going on with the age composition of this department, you can see that there has been a slight increase in the number of employees aged 29 or less. You can also then go over to average age and see how Heritage Canada compares with the rest of government in terms of average age. Some departments skew older, some skew younger. It's interesting contextual data to understand the department.
You can also see the composition of the EX cadres. You can see what the different levels are for EXs who are managing this department. If you want, you can click on the non-EX box and see what the ratio is between employees and management for that particular department.
We also have first official language data. You can see that for 60% of employees in Heritage Canada, French is their first official language, and English is the inverse, at 40%.
You can also see the gender distribution: 66% of employees are women, and 33% are men.
Moving over to results—and we'll go back to the level of the whole government again—you see our terms and definitions and stuff like that. We have our promise that departmental plans from 2018-19 are going to be updated here as soon as we can finish processing all of the data. This is a transition year between the old policy and the new policy, so things are moving a little more slowly than normal.
The first thing we show you is a handy-dandy summary in which we break out results into two categories. This is data as of the departmental reports that were tabled in the fall. The larger chunk of indicators covers results that were due at the end of that fiscal year and tells you whether they were achieved; the smaller one, underneath, is for results with a longer-term target. You can see where attention is required. In some cases, departments have decided the indicator is no longer necessary, or the information isn't there.
Below that, you have the largest departments, in terms of indicators. If you look at Canada Revenue Agency, you can get a mini-report card just for Canada Revenue Agency. Underneath, we have an interactive way of exploring data within the results data. This tool is in an awkward, adolescent phase right now, because we still have data as of the old policy—those are part of the departmental reports that were tabled in the fall—and we have planning information under the new policy. We're having to bridge the two policies.
You'll see here reference still to the idea of “sub” and “sub-subs”, because that was under the old policy. What this allows you to do is explore and look at all the programs, the subprograms, and the results. If you're someone who is very positive in nature, you can actually click on a filter to indicate that you only want to look at programs in which the result is being achieved. If you don't share that other person's rosy view of the world, you can change and just look at areas in which attention is required.
I'll pick, for example, the subprogram “Service Complaints” to show you what you can do.
If we open that up, you can see the result they're trying to achieve is that taxpayers receive a timely resolution to their service complaints. You can see that they were targeting 80% and that they met it. They actually achieved 83% in terms of percentage of taxpayer service complaints resolved within 30 business days.
Just to show you the through line, if we move over to the “Planning” tab, there's no actual results information in here. You can see that under “Tax” you still have “Service Complaints” as a program, that they're still seeking to achieve the same result, and that they continue to target 80%.
That will be reported in the fall, when those reports are tabled.
So this gives you a flavour of not only the financial people management but the results data. I'm going to go back out to the main section. How am I doing on time?
So going down to the featured content—I'll keep the budget funding for the very end—this one here is near and dear to my heart. I very much like how this lays out the government. This was inspired by the tool that Steve Ballmer built for the U.S. government. We took a look at that and thought we could build something a little cooler. This is dynamic, and you can open up different areas. What this particular slice of data shows you, broken out by ministerial portfolio, is the largest spenders in the Government of Canada. You can see it's Finance, ESDC, National Defence—no surprise. I should also say that you have “Ministerial Portfolio” in the first column; then you have organizations and then the actual programs. If you want, you can click on a program and get a description of it. You can see what the spending was and the number of FTEs, or you can go straight to the infographic. It really lets someone see the global view, find something interesting, and then dive in.
There are a lot of different data slices you can do here. My favourite one is in terms of types of spending. This can tell someone right away what the Government of Canada spends money on. What you see here is that by far the largest block is transfer payments. Below that is personnel. If you follow along for personnel, you can see the top four spenders on personnel are defence or security and justice related, and underneath that are professional and special services. So if someone asks what the Government of Canada spends money on: it's transfer payments; it's salaries—a good chunk of those salaries goes to defence, security, and justice—and after that, professional and special services. It really helps someone come to grips with the huge range of activities that we undertake.
I won't belabour this too much, even though it's 100% my favourite part of this tool. I could go on. We've kept that same design aesthetic for tracking budget funding. What we have here—this is as of the budget—is something we're going to be adding to substantially over the course of the year. In its current state, what you have in this first candy-stripe column is the actual vote 40. This represents the $7 billion that lines up with what was in budget 2018.
The next column contains the budget measures. You can see, based on size, which are the larger budget measures, and then you can see the organizations that are receiving funding. Where there is more than one organization, you can just click on the little plus sign and the diagram will open up so you can see them. If you want, you can click on a particular budget measure. What we did is go in and tear apart the budget for users: where there was relevant text that lined up with tables 8 to 11, we pulled that text out, and it's available in the diagram.
If you want to reorganize this, you can instead say “I want to see all of the budget funding that a particular organization is getting.” So here you can see that indigenous services is getting the largest amount of budget spending. You can open it up. You can see all the different initiatives. If you want, you can filter on a particular chapter within the budget, if you just want to see the initiatives that were in the growth chapter.
This diagram is as of the budget. Over the course of the year, we're planning to add an extra column: as initiatives are approved by TB, we're going to capture data from the TB supplementaries provided by departments that says, “Of the money that's being approved for this particular budget item, these are the programs that are getting it, and this is how much each program is getting.”
It should be very useful for Canadians and parliamentarians to see exactly where budget money is being allocated.
There is a bunch more content, but I think I'm at the end of my five minutes.
Thank you, Madam Chair.
Actually, could you put that slide back up, the last slide that you had on the budget? I'm going to go to Mr. Pagan. I'm going to try to explain what I understand is happening with this $7 billion and how transparent you are, because there's been a lot of discussion, and I think this tries to explain it, at least in my mind.
I looked at the $7 billion as an approved line of credit for a project that I have for my home. I know what items I'm going to use that money for. Then, at the end, at some point, I come back and say, “I used this line of credit for this type of activity,” and I'm able to explain for whom, when, and how much I spent. I'm trying to bring that down for when I go and talk to my constituents and this question is raised. I'm trying to simplify it in such a way that I can talk to my constituents.
First of all, $7 billion is a big number. Second of all, everybody is concerned that this is going to disappear. Most of my colleagues have talked about that. There is no transparency.
Can you help put it into a language that I can use to go and talk with my constituents, to be able to explain it and then translate it into...? Now that I have a tool, I can go and tell them, “Look, this is what it is, and this is what's going to happen.”
I have a technical question about InfoBase, particularly about the screen you're showing.
We're looking at the approximately $7 billion that appears in vote 40, and then you say that you supplement some of the information here with information from the budget, including proposed expenditures. There is a lot of information in the budget that doesn't appear in table A2.11, including information in the back tables for various chapters.
For instance, in the back table of chapter 4 there's apparently a suggested amount for “Establishing Better Rules to Protect the Environment and Grow the Economy”, and it's anticipated that the government will spend $125 million in this fiscal year on that initiative. This doesn't appear anywhere in table A2.11, so authority is not being sought for that spending under vote 40.
Is this information about budget 2018, or is this information about vote 40? What I'm trying to suggest is that these are actually two different things. It's a little bit misleading to say that it's information on budget 2018, because there are many things in the budget, including in the budget tables in the back chapters, that one would think were part of the budget but that aren't part of table A2.11.
That's fair enough—table A2.11 has to do with what we're seeking authority for—but I think this helps emphasize some of the differences between the estimates process and the budget document, a distinction that is being muddied, I would say, with this new vote 40 mechanism.
—which mostly corresponds to the annex in the main estimates, but actually not perfectly. There are a couple of discrepancies.
I'm just saying that this is supposed to benefit me as a parliamentarian and provide some clarity. I think overall there probably is more information, but it's not exactly self-evident. You have been using the term “truism”. Truisms are things that are so obviously true that they are hardly worth mentioning. I wouldn't quite say these are truisms. They may be true, but they are not truisms because it actually takes a bit of work to figure out what's going on, what's represented, and what's not.
I might spend a little bit of my extra time, for the benefit of the committee, running with the analogy of the house renovations and the contractor. Analogies are always prickly things, but I do think the proper characterization of that analogy is not that you're going for a quote, getting it back, and the contractor says he or she expects to spend this and then later reports what the actual expenditure was on the product you wanted. The analogy is a contractor who says he or she would expect to spend this on the roof, this on the walls, and this on the flooring, but it depends what flooring you pick.
The contractor is asking for approval for a certain amount for the flooring, and a certain amount for the windows, and a certain amount for the roof. Then they're going to go away and make decisions. They're going to make decisions about what shingles you're going to have, and what windows you're going to have, and what flooring you're going to have. When the contractor is done, they will come back and let you know what they did, and they will table the receipts.
As somebody who's living in the home, I would expect you would want that contractor to come back to you and say they've done research on the flooring and here are the different kinds of flooring you could have, and here are the price points. Here's what's in your budget, and here's what's not. Now the contractor would like you to approve the particular things they're going to do with this money.
I would expect that, if a contractor came to me and said here's the budget for the flooring, they're going to go away and pick the floor. They can't spend more than you authorized; they can spend less; but when you come home, you will have new flooring. The contractor will have decided what it is. At that point it's too late to do anything about it other than approving more money for the contractor to rip out the floor and put in the floor you want.
The point about parliamentary oversight is that parliamentarians are supposed to have a pretty good idea of what the money is going to be spent on before they approve the money. If they just said, “There's a program, we like the idea of it, and that seems like a reasonable number even though you can't fill in all the details, so go ahead and spend the money and just report on how you spent it,” that wouldn't be good enough for homeowners. They would expect far more hands-on decision-making power in terms of the details of what's being done under that budget item.
I do think it's an apt analogy, but I think it has to be rounded out. I won't speak for my colleagues here, but what I'm asking for—what the NDP is asking for, and I suspect maybe others as well—is this: it's all well and good to say here are our projected expenditures for the year and here's more or less what we want to spend them on. However, in the old process—that is to say, all the approvals that have happened heretofore—parliamentarians, before they approved the money, have been able to put questions to a minister who has already gone through Treasury Board and already figured out how many FTEs are going to be there.
One of the things that we have here.... I mentioned the idea that you have program tagging. This is something that we made a big hoopla about last year when we presented to the Standing Senate Committee on National Finance. We showed this new tool. What you have is this whole mix of programs. This speaks to one of the nice changes that was brought about by the policy on results.
Previously, you had a lot of detailed information that was siloed in all the individual departmental reports. If someone came in, for example, and said, “I want to create the virtual department of clean drinking water so that I can understand what the government is doing to ensure that all Canadians have access to clean drinking water”, that person would have had the joyous task of opening 80 or 90 reports, one for each department, for each year. That person would have had to go through and ask, “Is this a program related to this? Is this a program...?” It's very manual and very error-prone.
What we're doing to change that under the new policy is we're bringing all this low-level information to one place. We let someone come in and say, “I want to just kind of subdivide the world this way. I want to use this tagging scheme to pull together programs.”
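To make the tagging idea concrete, here is a minimal sketch in Python of how a tag-to-program index might work. The program names and tags are invented for illustration; they are not actual InfoBase data.

```python
# Hypothetical sketch of tag-based grouping; program names and tags
# are invented for illustration, not actual InfoBase data.
from collections import defaultdict

programs = [
    {"name": "Water Infrastructure Fund",
     "tags": ["clean drinking water"]},
    {"name": "First Nations Water Program",
     "tags": ["clean drinking water", "aboriginal peoples"]},
    {"name": "Student Work Placements",
     "tags": ["youth"]},
]

def group_by_tag(programs):
    """Build the 'virtual department' index: tag -> list of program names."""
    index = defaultdict(list)
    for program in programs:
        for tag in program["tags"]:
            index[tag].append(program["name"])
    return dict(index)

index = group_by_tag(programs)
print(index["clean drinking water"])
# → ['Water Infrastructure Fund', 'First Nations Water Program']
```

With an index like this, someone looking for the “virtual department of clean drinking water” queries one tag instead of opening dozens of departmental reports.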
What we currently have here are four ways of reassembling the government into different collections of programs. The first two are the classic ones: we list appropriated federal organizations by ministerial portfolio and in a simple A to Z listing, so that's not particularly interesting. Then we have two of these new tagging schemes. We have “What we do.” We kind of reassemble the government based on....
Here, I'll turn this on instead of talking about it. I showed this before in terms of that bar graph. We have economic affairs and social affairs. What you can do, though, is drill in and see, under economic affairs, activities such as employment and income security.
If you open that up, you get a description of what that is, and you see all of the programs that contribute to that particular area. If you want, you can click and see an infographic. You can see that this is the historical and planned spending for this particular sub-area. We've taken all the programs under that tag, and we've reassembled them into a pretend department of employment and income security.
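As a rough sketch of that rollup, again with invented program names and figures (in millions of dollars) rather than real spending data, the per-year totals for one tag could be computed like this:

```python
# Hypothetical rollup of historical/planned spending for one tag;
# program names and dollar figures (in $ millions) are invented.
programs = [
    {"name": "Employment Insurance",
     "tag": "employment and income security",
     "spending": {"2016-17": 20000, "2017-18": 21500}},
    {"name": "Job Grants",
     "tag": "employment and income security",
     "spending": {"2016-17": 300, "2017-18": 400}},
]

def tag_spending(programs, tag):
    """Total spending per year across all programs under one tag."""
    totals = {}
    for program in programs:
        if program["tag"] != tag:
            continue
        for year, amount in program["spending"].items():
            totals[year] = totals.get(year, 0) + amount
    return totals

print(tag_spending(programs, "employment and income security"))
# → {'2016-17': 20300, '2017-18': 21900}
```

The “pretend department” view is just this kind of per-tag aggregation rendered as an infographic.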
We'll be adding to this over the course of the year. We're going to be adding another tagging scheme: client groups. I mentioned this before. If you want to see all the programs that support seniors, if you want to see all the programs that support youth, that's going to be there.
One of the challenges with some of these tagging schemes is what's called double counting. That sounds ominous, but it's not. If we go back out, we have this thing called “How we help”. This is a tagging scheme where we show the different ways in which a program delivers its service. Is it a program delivering services to Canadians? Is it a transfer payment?
Some programs deliver their services through multiple channels. Some programs support more than one client group. You might have a program that supports both aboriginal people and employers, for example. If you were to add up all the programs under each tag, it might look as though the government is spending three times the amount. This is why we have warnings noting that, under this tagging scheme, programs may carry more than one tag. We're showing the up-to amount: the largest amount of money that could be spent on youth, for example. These are all the programs that have been tagged as supporting youth, but they do other things as well. They might also be supporting seniors, so you would see some of that spending there.
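The double-counting issue and the “up-to amount” can be illustrated with a small sketch; the program names, client groups, and amounts (in $ millions) are invented:

```python
# Hypothetical illustration of double counting with multi-tag programs;
# names, client groups, and amounts (in $ millions) are invented.
programs = [
    {"name": "Skills Training", "spending": 500,
     "client_groups": ["youth", "employers"]},
    {"name": "Youth Employment Strategy", "spending": 300,
     "client_groups": ["youth"]},
]

def naive_total(programs):
    """Summing every program under every tag double-counts multi-tag programs."""
    return sum(p["spending"] for p in programs for _ in p["client_groups"])

def up_to_amount(programs, tag):
    """The 'up-to' figure: every program tagged with this client group,
    even if the program also serves other groups."""
    return sum(p["spending"] for p in programs if tag in p["client_groups"])

print(naive_total(programs))            # → 1300: Skills Training counted twice
print(up_to_amount(programs, "youth"))  # → 800: at most $800M touches youth
```

Note that the “youth” and “employers” up-to amounts (800 + 500) together exceed the true total of 800 spent across both programs, which is exactly why a warning accompanies multi-tagged schemes.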