Thank you for the opportunity to address the House of Commons Standing Committee on Natural Resources on the concern regarding energy data in Canada.
I'll start my remarks with my recommendation that Canada establish an independent energy data agency using a governance model similar to that of the U.S. Energy Information Administration.
We have been here before. Numerous organizations in the energy sector have noted for decades the need for an independent energy data centre. More recently, we have seen concerns raised about the lack of such an agency.
In 2012, Michal Moore, from the University of Calgary, published “A Proposal to Create a Pan-Canadian Energy Information Organization”. That same year, we had this glaring headline from a Financial Post journalist, Jameson Berkow: “Finding information about the Canadian energy industry is easy—if you go to the U.S.” Also in 2012, the Senate's standing committee on energy and environment stated, “It's Time for a Canadian Energy Information Agency.”
In 2015, the Canadian energy strategy, agreed to by all the provinces, included goal 3.1: to “[i]mprove” the “quality of energy data across Canada”. That led to a discussion between the deputy minister of Alberta Energy at the time, Grant Sprague, the assistant deputy minister of energy at NRCan, Jay Khosla, and me. We agreed that it was important to reach out to stakeholders from across the country to see what they wanted regarding energy data and the status of data in Canada.
During that time, CERI produced an assessment of the data challenges we face in Canada, which include the following.
There is a lack of data. Only 38% of the 189 potential indicators are gathered. In particular, we lack information on emerging technologies and new energy services.
There is incoherent data. For example, we found at least 10 different definitions for GHG emissions.
There's inconsistent data. Out of 26 indicators assessed from various sources, 42% differed in value by more than 10%, so it is difficult to determine which source is correct.
We also found data lacking credibility. A CERI survey found different levels of trust by stakeholders regarding organizations that produce data or analysis. That percentage of trust varied as follows: government agencies, 67%; governments, 17%; economic experts and academia, 50%; and industry associations, 42%.
There are also data gaps. To generate a complete set of data requires a review of up to 20 sources of major and minor publications, and that's beyond the resources and expertise of most stakeholders.
The data is not timely. Only 61 per cent of data is available after one year, and the rest takes even longer, which makes it difficult to produce trends or to see where we are at any one time.
The full data gap analysis report is provided to the committee as background document A. You should have it in your package.
CERI, the Ivey Foundation, and the Trottier Foundation worked together to gather interested people from across the country to discuss what needed to be done. In 2017, after two years of discussion, we came to some clear determinations of what was needed, but no one at the time was willing to put funding towards achieving an energy information organization.
The stakeholders were unanimous in their support for an independent and neutral agency with some analytical services. I've brought a summary of the overall discussions as background document B. You should have that.
CERI worked to reinforce our understanding of stakeholders by conducting a survey regarding the need for an energy information organization. The results of that study are included as background document C.
To crystallize our thinking on this matter, CERI developed a business case for stakeholders to reference. The full document is attached as background document D. The main responsibilities of an energy information organization would be threefold: data management, analysis and reports, and communications.
Data management would include up-to-date artificial intelligence and machine learning tools and would cover things like data cleaning and quality assurance, data reconciliation and harmonization, ensuring relevance and timeliness, and data gap analysis, with research to fill those gaps.
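To make the reconciliation and harmonization tasks concrete, here is a minimal Python sketch. The source names, reported values, unit factors, and the 10% discrepancy threshold are all hypothetical illustrations chosen for this example, not actual CERI methodology.

```python
# Hypothetical sketch of two data-management tasks an energy information
# organization would perform: unit harmonization and source reconciliation.
# All source names, values, and the 10% threshold are illustrative.

PJ_PER_UNIT = {"PJ": 1.0, "TJ": 0.001, "GWh": 0.0036}  # energy unit factors

def to_petajoules(value, unit):
    """Harmonize a reported quantity to a common unit (petajoules)."""
    return value * PJ_PER_UNIT[unit]

def reconcile(indicator, reports):
    """Compare the same indicator across sources; flag large disagreement.

    `reports` is a list of (source, value, unit) tuples. Returns the
    harmonized values and whether the sources differ by more than 10%.
    """
    harmonized = {src: to_petajoules(v, u) for src, v, u in reports}
    lo, hi = min(harmonized.values()), max(harmonized.values())
    inconsistent = (hi - lo) / lo > 0.10
    return harmonized, inconsistent

values, flagged = reconcile("natural-gas demand, Quebec", [
    ("StatsCan", 210.0, "PJ"),
    ("ProvincialRegulator", 245_000.0, "TJ"),  # same quantity, different unit
])
print(values, flagged)
```

Harmonizing to a single unit first is what makes the comparison meaningful; the flag then captures the kind of greater-than-10% disagreement between sources described earlier.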
The second part was analysis and reports. Looking at the past, we would conduct analysis of historical developments and trends; for the present, market monitoring and assessments; and for the future, scenario analysis unfettered by existing policy.
From a communications perspective, we would ensure unrestricted access to information and make sure the information was shared across organizations in the country.
Key to the success of such an organization is an open platform for sharing data. Many organizations and governments in Canada gather data. We should leverage these activities and the value they create by forming a collaboration among the parties. This can build trust, which is vital for the data being gathered, and promote the use of this information as a source for evidence-based decisions by government, industry, indigenous groups, and environmental organizations.
This country is in the midst of a transition to lower-carbon energy systems. Important decisions are being made that will affect the lives and businesses of us all. Without a comprehensive and credible set of data that we all recognize, those decisions and that transition will be more challenging.
The Pembina Institute is an intensive user of energy and climate data. On a daily basis, technical and policy experts in our organization access and analyze energy and energy-related data from various federal and provincial agencies. This includes analysis to support the development of methane regulations, to re-envision freight transportation in the GTA, and to provide recommendations on how to decarbonize electricity generation in Alberta. As a result, we have a practical understanding of the issues and limitations of energy data in Canada.
In addition, our organization has long-standing support for evidence-based energy and climate policy. We believe that high-quality data and analysis are a critical component of informed public policy, and we know that in Canada we have work before us to ensure that decision-makers in government, business, and civil society have access to timely, complete, and independent energy data and analysis.
First, I would like to explain why good energy data is crucial in a context of energy transition. Our energy systems—that is, the way we produce and use energy—are under huge pressure to change, and they are already changing at a pretty fast pace.
A first cause of this change is the imperative to develop low-carbon alternatives to address climate change. A second cause is purely economic: some zero-carbon options have become more cost effective than hydrocarbons. Wind power, for example, is now a cheaper option than natural gas to produce electricity in many jurisdictions. A third reason is the emergence of new technologies. Think of smart grids, for example, or the rapid change in behaviours that affect energy demand. Think of millennials, for example, who drive significantly less than previous generations.
For all these reasons, and many others that I won't have time to cover here, Canada's energy systems are changing very fast. As a society, we need to better understand and document this change if we want to accelerate the energy transition and seize the opportunity offered by the new economy.
Energy data in Canada suffer from many flaws, and these flaws impede our ability to develop efficient and effective policy, to suggest alternative policy pathways, and to fully support the rise of a clean economy.
First, there is a quality issue. Some of the data made available are inconsistent. For example, StatsCan reports that Quebec is using significantly more natural gas than it is supplied with. Second, there are missing data. Some public reports do not include major energy sources. For example, the report on energy supply and demand, which is the go-to place for understanding how we generate and use our energy in Canada, does not report wind, solar, or biomass as energy sources.
Quality and missing data are only two of the most frequent flaws. I won't have time to go through all the issues at stake here, but we could easily add consistency across datasets, confidentiality issues, transparency issues, granularity, timely availability, and so on.
We have entered a data-driven world. We need credible and reliable energy datasets to develop informed, transparent, and accountable policies in order to accelerate the energy transition as well as the rise of the new clean economy in Canada.
I'll now briefly go through some of our recommendations.
In our role as advocates for evidence-based policy, we have recently undertaken a series of expert interviews and developed recommendations for modernizing the National Energy Board. The state of energy data was raised repeatedly in those interviews.
Combining what we heard from the 23 experts we talked to and what we know from our own work, we recommend harmonizing and aggregating data across federal, provincial, and territorial agencies; producing energy data with aligned timing, units, and assumptions; expanding the scope of collected data to include, for example, demand-side energy data and information on international and interprovincial energy trade; making data as granular as possible and available in formats that can be easily disaggregated and manipulated; improving quality assurance and, where applicable, providing information on assumptions and inconsistencies between datasets; and reducing time lags between the collection and publication of data.
To implement these recommendations, we further recommend the creation of an independent Canadian energy information agency.
A second finding from our research was the issue of independence and the need to separate data collection and analysis functions from policy-making and regulatory agencies.
In the case of the National Energy Board, we have the same body that is evaluating and regulating a project also producing the energy and supply and demand forecasts that may be used to evaluate that project. This creates a situation where the forecast used to determine project feasibility may not be viewed as sufficiently independent. The NEB expert panel also recognized this issue and recommended creating an independent energy information agency. We agree and would like to highlight that there are two separate functions that need to be considered: data collection, and data analysis.
We recommend separating these two functions, expanding energy data collection at Statistics Canada, and creating a new Canadian energy information agency charged with disseminating energy and climate data.
We envision the new Canadian energy information agency relying on Statistics Canada for the collection and harmonization of energy data. This is consistent with the mandate of Statistics Canada, and it makes use of existing data collection capacity, expertise, and relationships with provincial governments.
The new energy information agency should be housed within Natural Resources Canada and report to the Minister of Natural Resources. However, the independence of that agency should be established by specifying, in legislation, that the agency does not require review or approval of its statistics or forecasting by any government entity.
The mandate of the agency should include reporting quarterly on energy supply, demand, sources, and downstream consumption, including international and interprovincial energy import and export; producing annual scenarios for energy supply and demand, including a reference case that considers domestic and international action on climate change; producing an annual report on Canada's progress towards fulfilling its commitments to addressing climate change; managing a coordinated interface, a one-stop-shop platform to disseminate all energy data and analysis; making all data available to the public at no cost in easy-to-use formats; conducting proactive energy education to increase energy literacy; participating in project hearings as expert witnesses with respect to energy and GHG emissions modelling; and finally, advising government ministries and agencies on energy matters upon request.
In conclusion, we would like to thank the committee for initiating this study. Ensuring that we have high-quality energy data and analysis is essential to good decision-making, and is especially critical in the current context of rapidly changing energy systems.
We look forward to the recommendations that come out of this study and thank you for the opportunity to participate in the discussion.
There was silence, right? They said, “Let's get the federal government to do this.” That was basically it. They all love the idea, but there's very little money coming forward to put towards it.
Would it last in the private sector? I don't think so, unless there were some assured funding streams on which it could depend. If there were an allocated funding stream from all 14 federal, provincial, and territorial governments, I think that would work. It would also be important to attach certain caveats to the funding, such as requiring it to meet certain performance standards that are out in the open and clear.
I go back to the way the government is working right now, and there are three issues that come to mind. Two deal with StatsCan and one deals with the National Energy Board.
On the two with StatsCan, we're all aware of the long-form census debate, which was resolved by the chief statistician resigning in protest because the government required changes that, in his opinion, were not appropriate. That's an example of government getting directly involved in that activity. A second one was the impasse between StatsCan and Shared Services over the computer systems. One thing I think would be vital to making something like this successful is top-of-the-line artificial intelligence that anyone can use with very little training. I don't see that happening any time soon within the federal family, given other issues such as being paid on time, for example—figure that out first and then look into the future.
In terms of the National Energy Board, some of their analysis done several years ago was limited to current policies at the time. Their scenario analyses, which may have looked at different options, weren't allowed to include alternative policy scenarios.
Those are examples of where it would be difficult to work in the private sector and is difficult to work in the public sector. Either one will work if you resolve some of those issues: funding in the private sector and legislative independence in the public sector.
I just want to highlight something very quickly. It's not so much a question as it is a statement.
The NEB's modernization panel studied this issue and added something. You know, I've sat on public accounts, and every department is struggling with data. It just seems to be an overarching problem across government. We can never seem to have relevant data in a relevant time frame that allows us to make accurate decisions.
Their recommendation was that an independent organization that would be federally funded and at arm's length from government could take all the relevant data from the sources that were willing to participate with it and use that data to make better and more informed decisions about how we create energy policy in this country.
Just in this last five minutes, we've been talking about how we're utilizing data from five or six years ago, and some of it is from eight years ago. That's crazy. If I told that to somebody in the private sector, and I was working for them, they'd just say, “Get out”, but because it's government, you can get away with that. It makes no sense at all.
The reason I made the earlier comment that it should not be within the purview of the NEB in my opinion is that I believe it needs to be an arm's-length institution that's not looking for funding but is simply there to give Canadian people relevant data so that they can form an opinion about energy policy in this country.
That's all I want to say.
My name is Bruce Lourie, and I'm president of the Ivey Foundation.
The Ivey Foundation is a 70-year-old philanthropic granting and policy research organization. Today we have a programmatic mission to help Canada transition to a low-carbon future using evidence-based policy and communications. Our goal is to better integrate the economy and the environment. We're basically an independent organization that provides grants and undertakes research in the low-carbon economy, and we support many of the groups working in Canada, such as the Ecofiscal Commission and Clean Energy Canada. I've done a lot of work with a previous speaker, Allan Fogwill, at CERI. We basically work with groups across the country.
Our work has included—and I think Allan mentioned this—convening experts across the country. We've done workshops in Toronto, Calgary, Ottawa, Montreal, and Vancouver, bringing together energy regulators, energy companies, provincial governments, Statistics Canada, the National Energy Board, NGOs, expert private modellers, experts from the U.K., and experts from the U.S. basically to help understand how we can create a more rigorous energy-data and energy-modelling capacity in this country.
One of the things that we discovered in that—and I'll share just a couple of anecdotes—is this: if you're a Canadian energy-resources researcher, an energy-systems modeller, or a climate-policy consultant, consider where you are likely to get your data and information from. It's not Natural Resources Canada. It's not the Department of Environment and Climate Change. It's not Statistics Canada. It's the U.S. Energy Information Administration. Most of us in Canada doing research on these things actually have to go to a U.S. department to get data that's compiled for us by this American government agency. It's worth considering that, given the geopolitical context right now in North America, and given that we're doing things like negotiating NAFTA, I can almost guarantee that some of the information we're using in those negotiations was generated by the U.S. government. I think that's a situation that clearly needs to change.
Canada's energy information systems were once very well regarded, but I'm going back many decades now. The support for science and energy data began to unravel in the mid- to late 1990s with precipitous declines through the early 2000s and up until very recently. I heard a story recently of two federal government climate change policy experts, one in Canada and one in the U.S. The Canadian was saying, “We wonder how in the U.S., with all of your data infrastructure, your energy experts, your think tanks, and your sophisticated models, you cannot come up with any national climate policies.” The American replied, saying, “That's funny. We always wonder how, given that you have none of those, you still manage to in Canada.”
We have a challenge in this country. The reality is that we're kind of bumbling along blindly, and by blindly I mean we have limited information, a lack of access to that information, and a lack of transparency around the information we have and how it's used.
I really wonder how we can have an intelligent debate in this country on the emissions potential of carbon pricing if we don't have the data to understand that. How can we have an intelligent debate on the environmental effects of pipelines if we don't have the data and we don't agree to a common set of data across this country?
One of the reasons why people always wonder how industry can say this, environmental groups can say this, and governments can say this—and they all have different answers and different numbers—is because we don't have a shared set of common, adequate, high-quality data in this country. Given that we often talk of Canada being an energy superpower, the reality is that when it comes to energy data, we are anything but an energy superpower.
I would go so far as to say that if some of this information and institutional infrastructure had been available 10 years ago, or perhaps even two years ago, we probably wouldn't be in the energy policy mess we're in today.
There are some great international examples, and I'm going to touch on one. Imagine a world where energy and climate data are made available to researchers, industry, and NGOs alike. Then, imagine well-funded modelling experts producing multiple sophisticated energy and economic models to inform policy. Add to that what I can only describe as the pinnacle of evidence-based policy-making, a group of independent experts with the mandate to set long-range climate policy goals based on expert models and scenarios, and that this evidence produces carbon budgets extending 10 years into the future, and that this is used by governments and evaluated, with the results fed back into those models to inform the next five-year planning window.
This isn't fantasy; this is exactly what the U.K. does. It's called the Committee on Climate Change. We know it can be done, and this frankly isn't a complicated matter.
Everything is politically complicated in Canada, but technically this isn't that challenging. The U.S. does it: the Energy Information Administration does it. The International Energy Agency does it. The U.K. does it; they do a very good job of it. I think we have a clear consensus in this country, based on the research we've done, the convening we've done, and the experts we've talked to, that it isn't that hard for Canada to move forward on it.
Under the pan-Canadian framework, we've committed to an expert engagement mechanism to support climate policy with independent advice. It's been excruciatingly slow for that to develop. I think that needs to move forward. We don't just create energy data because it's fun having energy data; we create energy data so we can use it to inform policy. That's another mechanism that's needed.
I'll end with three main points.
One, we need to build the energy information capacity in this country, and we strongly believe that creating a Canadian energy information organization is sorely needed. This needs to be an independent entity, conceived as a partnership among groups like Statistics Canada, NRCan, the Department of Environment, and other relevant energy data groups, experts, provinces, and the private sector across Canada.
Two, we need to support independent modelling and analysis. Our capacity in this country has declined to a state where we basically don't have the fundamental modelling and analysis we need to make these complex decisions. I know you heard from David Layzell at University of Calgary a couple of weeks ago, talking about a transition pathways initiative. We are strong supporters of that, and believe that kind of cutting-edge work is needed in this country.
Finally, we need an independent climate expert institute in Canada that will use the modelling and analysis to help provide the policy advice we need to the country. Again, it needs to be independent, transparent, evidence based, and expert. This was referenced in the pan-Canadian framework, but it has been very slow to emerge.
I've provided a couple of little graphics in my speaking notes that you can look at to see how these things might fit together.
With that, I'll thank you very much for considering this important issue and for having me present today. Merci.
Thank you for inviting me. My apologies: there were some pictures to go with this, which you will eventually get.
The driving of the last spike in 1885 was the culmination of a nearly two-decade effort to bring certainty to a nation. It forged a national identity in steel and steam, in iron and timber, and more than 130 years later, Canada, a prosperous nation with strong linkages to the south and opening markets in Asia and Europe, seeks certainty in the development and delivery of energy data.
Why is data important? I guess you've had a lot of presentations on why it's important. I'm sure that in your daily life you do know.
For the next 10 minutes, I'll outline why an agency that can provide timely, reliable, and transparent energy data is necessary. I'll discuss the necessary elements of data management, acquisition, and sharing, define leadership gaps in transitioning to data-driven decision-making, and the steps to greater energy certainty, not only as a national policy but as a national imperative.
In preparing for this presentation, I came across a May 2017 Economist briefing that drew a striking analogy between data and oil. “Data”, the authors propose, “are to this century what oil was to the last...[the] driver of growth and change.” They continued:
The new economy is...about analysing rapid real-time flows of often unstructured data...photos and videos generated by users of social networks, the reams of information produced by commuters on their way to work, the flood of data from hundreds of sensors in a jet engine.
The statement resonates with me as someone with a lifetime of experience working in and around the mining and energy industries, first as a researcher-scientist and oil production worker, later as a communications professional and atomic radiation worker, and, for the last 16 years, as an investigator of railway and pipeline accidents.
Why is data important? Or, more importantly, why is a national data agency necessary?
As the article hints at and as a colleague of mine recently told me, the fundamental problem is that we're not getting snapshots of information about energy fast enough to make informed decisions about things such as energy planning and environmental impacts. It might be easier to ask, what are the costs of not having a national energy data agency in Canada? I think we're all living that.
If you had my pictures right now, you'd see a picture of the Mackenzie Valley, where the pipeline was first proposed as a joint venture partnership. A new effort came forward 27 years after that, which finally I think overcame the barriers related to aboriginal and industrial co-operation. As most of us probably also know, in December 2017, after a six-year-long process to reach approvals, the partners walked away—again.
We're losing opportunities, whether these are the right opportunities or not. I think the persons responsible made a comment to this effect: “I don't know what the problems are, but a process that should take two years in a business cycle can't take six.” The fundamental economics change so quickly that they can't do what they'd like to do.
My story also goes to September 15, 2008, when a meeting was convened in New York of the leaders in the financial world to discuss the bankruptcy of Lehman Brothers. When these leaders of business were asked what their exposure to Lehman Brothers was, nobody knew. Once again, it was a great crisis in data.
What you need to know is that to manage, one needs to measure; to measure, you need to audit; and to audit, you need data. In today's day and age, to know what is happening and what is going to happen, you need real-time data. The EDM Council, the Enterprise Data Management Council, which grew out of efforts to teach the finance industry how to do this, described a holy trinity of data management.
You need unique and precise identification of things, and you need unified views of meaning across organizations, locations, linkages, and interconnections. The procedure is actually the reverse of what you might think: you start with your business practices, what you are trying to do, and reverse-engineer back to the critical data elements necessary to run those processes. Once you've identified those critical data elements, you clearly and uniquely identify them, with a taxonomy and an ontology, so you can actually work with them and everybody is working with the same understanding.
You need to establish a unified view across organizations.
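As a rough illustration of the critical data element idea, here is a minimal Python sketch. The element IDs, field names, definitions, and organizations are all invented for this example; they do not reflect any real registry or the EDM Council's actual artifacts.

```python
# Hypothetical sketch of a "critical data element" registry: one canonical
# definition per element, with each organization's local field names mapped
# to it. Element IDs, names, and mappings are invented for illustration.
from dataclasses import dataclass

@dataclass(frozen=True)
class CriticalDataElement:
    cde_id: str       # unique, precise identifier (the taxonomy)
    definition: str   # shared meaning across organizations (the ontology)
    unit: str

CDE_REGISTRY = {
    "CDE-001": CriticalDataElement("CDE-001", "Annual natural gas deliveries", "PJ"),
}

# Each organization's local vocabulary, reverse-engineered from its processes.
LOCAL_TO_CDE = {
    ("StatsCan", "ng_deliveries_pj"): "CDE-001",
    ("ProvinceX", "gas_sales_total"): "CDE-001",
}

def resolve(org, local_field):
    """Map a local field name to its canonical critical data element."""
    return CDE_REGISTRY[LOCAL_TO_CDE[(org, local_field)]]

elem = resolve("ProvinceX", "gas_sales_total")
print(elem.cde_id, elem.unit)  # both organizations now mean the same thing
```

The point of the mapping layer is the unified view: two organizations keep their own field names, but both resolve to one identifier, one definition, and one unit.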
I'd like to just note that many of us are dealing with something called Phoenix. Most people see Phoenix as a data processing problem, and it is in fact not a data processing problem. It's a data meaning problem. They started with the process and then tried to make the data fit it. I guess the analogy might be that they started with the pants and tried to make the person fit the pants.
We are now in a situation where we don't have the data we need, and we also don't have a common understanding. Then there is the focus on modern data storage and large storage models; we all know the concept of the data lake. A lot of the discussion I heard when I first sat down here half an hour ago was about structured data, and that's the least of our problems. In fact, our biggest problem may be that we wait until data is structured before we actually take it, use it, and apply it. By then it's too late. It has already had the data tax applied, and the data tax is one of the things you are discussing: you can't get the data fast enough to make meaningful decisions, and you lose the opportunity while you're waiting for something to happen.
There's another thing that we're lacking. I spent the last two years in a program at Columbia University, a master's in applied analytics, and I went there because it was a program focusing on the leadership skills you need to run data processes. I was one of the investigators who worked on Lac-Mégantic, and when we came through that process, there were 18 causal factors to the accident. Standing back, as an individual, what I saw was 18 missed opportunities to intervene. When you looked at it, there was more than enough data, tons of it, but it wasn't being prepared, provided in a timely manner, and analyzed in such a way that you could take action to prevent the catastrophe.
Interestingly enough, the data that is currently out there in the public venue comes mostly from media, and media gets involved when something is newsworthy. So the focus in the resource sector is typically on the low-frequency catastrophic events, which have horrific results. We're failing on two levels: first, because we're not getting the data to prevent or mitigate those events, and second, because we're not sharing information about what is happening.
One of the things that I've been trying to push forward is that this institute is important, but you can't just think of it in isolation. Data cannot function in isolation. We have lots of silos in government where we've collected wonderful information, and I sometimes call it hoarding. I come from a family of great hoarders, so I understand a little bit about it. The reality is that we collect an enormous amount of data, and this is not unique to government. Data is dirty, it's messy, it's hard to work with, it's frustrating, it's inconsistent, and we don't want to give it out until we know we've got it right. That's not how it works in today's day and age.
We need to get that data out of its silos. We need to get it into a process where we can actually access and use it, and the data lake and the modern data principles don't care what format it's in, as long as we've identified what it is and we know where to get it, and we'll process it when we use it, right? I call that schema on read.
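The schema-on-read idea can be sketched in a few lines of Python. The record shapes and field names below are hypothetical; the point is only that raw data lands immediately and structure is imposed at read time, not at ingest.

```python
# Illustrative schema-on-read: raw records go into the "lake" untouched,
# and a schema is applied only when a consumer reads them. Field names
# and record shapes here are hypothetical.
import json

lake = []  # stand-in for a data lake: raw payloads, no upfront schema

def ingest(raw):
    """Land data immediately; no validation, no structuring, no delay."""
    lake.append(raw)

def read_as(schema, caster=float):
    """Apply a schema at read time: keep records that have the needed fields."""
    out = []
    for raw in lake:
        rec = json.loads(raw)
        if all(f in rec for f in schema):
            out.append({f: caster(rec[f]) for f in schema})
    return out

ingest('{"wind_GWh": "120", "province": "AB"}')   # structured enough
ingest('{"note": "sensor offline"}')               # unstructured, kept anyway
ingest('{"wind_GWh": "95"}')

demand_view = read_as(["wind_GWh"])
print(demand_view)  # only records that satisfy the read-time schema
```

Because nothing is rejected at ingest, the unstructured record is still in the lake for a future consumer with a different schema; that is the contrast with waiting for data to be fully structured before using it.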
What the program at Columbia was designed to do was to create a—
I was on the board of directors of the Ontario Power Authority for many years, so I might be able to answer that question.
I would disagree that we don't have data on that. There is a lot of data on that. The question is, who has access to that data and who uses it in the right way? Groups like the Independent Electricity System Operator in Ontario have all kinds of data. They have billions and billions of data points on everything in the system.
To comment on the last question and the idea of creating some kind of national organization, it would be some kind of hybrid, getting StatsCan involved because they have the legal ability to compel information, but also working with independent experts so that it's quasi-independent.
The issue is that we don't have a common set of, say, how renewable energy would be looked at in Alberta versus Ontario. We don't have a common understanding of how the systems, pricing, and subsidies are different. If we had some kind of centralized body in Canada, we could look at common information. How does Quebec do this? How does Alberta do this? How does Ontario do this? Right now we have very independent systems, with people using different definitions, measurements, and policy tools. That's the benefit of trying to get all this together under one roof, but that's also why it needs to have people from across the jurisdictions in this country, because we're a big, complex country, with a lot of autonomy within the provinces.
One of the real keys—I didn't get a lot of time to talk about it—is the Internet of things technology, the industrial Internet of things.
I'll give you a quick example. I did an investigation a couple of years ago. A runaway crude oil car collided with a standing train and derailed a bunch of cars. We went to look at the car, and there was this device on top of one of the tank cars. I thought to myself, “I wonder what that is.” Somebody used to call me “Inspector Clouseau”, and I'd say, “I'm not Clouseau, I'm Columbo: I have one more question.” I went to the source and found out what the tool did. It recorded GPS location and G-forces in three dimensions, and it gave the peak impact, its duration, and the time. We got data that we couldn't have gotten otherwise.
When we put that data into the system, we found out that the tank car had experienced a force four times the design capacity of the car. Then the obvious question becomes, how come this didn't make the six o'clock news? Well, because we had this data, we found the telemetry: the impact duration was only five one-hundredths of a second. There was a massive impact on the car, but it was released before the car had fully crumpled. It had actually buckled; there was a small buckle in the car, because essentially the train hit with a crumple zone and the forces were transmitted through once the locomotives got moving.
The point is, why aren't we accessing that data? Ask yourselves: do we say this is obviously way too expensive to get? These sensors on a car cost $150 a year. It's chicken feed. Why that's not a requirement on every tank car, I don't know. On the comparison between pipeline and rail, for instance, I don't think there's anything inherent that says—I investigate both—one is safer than the other, except that the pipeline has sensors in every foot of it and is being constantly monitored. A tanker train, though, leaves the station and gets to the other end, and there are few steps along the way where anyone has a look at it. The reality is that you could be watching it in the same way, and one tank car derailing is a lot different from having 30, 40, or 50 derail.