Unknown Speaker 0:04
So the first question I have on the slide is how Google Cloud can help you in the digital era, and the reason is, when I looked at
Unknown Speaker 0:15
Google, I was like, why Google in manufacturing? I had the same reaction when Google called me while I was at Siemens: why would I join Google? When we are in manufacturing, we never hear about them, you know? And then I started reflecting, and like Amy mentioned, I used to be a scientist. I worked in the food and beverage industry. I worked in the medical and pharma industry, and what I realized is that everything was linking back to knowledge, because every time you have a problem, you're looking for some information to figure out an answer or design a new experiment, and then if it fails, you repeat this. So what do you do? When I started 30 years ago, I went to the library. And what do you know Google for? We organize the world's information and make it easily accessible and useful. So when you think about what we're doing in our personal life and use Google Search or YouTube or Maps, it's doing exactly that. Now when you apply those same tools to your enterprise data, that's when it starts making sense, and I'm going to explain how we do it, because this morning, Walker mentioned in his ProveIt
Unknown Speaker 1:32
keynote why he likes Google. I think he mentioned it was the cost and how easy it is to use and retrieve your data and correlate the data.
Unknown Speaker 1:44
So how do we do this? We do this because we are focusing on data and AI. As opposed to other hyperscalers, we started with our strengths, which are data and AI, as you can see, being the foundation. And then we're a modern infrastructure cloud, meaning that we're not just doing lift and shift. We're doing lift and modernize: whatever applications you're going to run in the cloud, they're going to gain agility and speed. And I'll explain why. We allow you to collaborate and make sure that cross-functional groups can access the data that they need and
Unknown Speaker 2:18
gain speed in terms of retrieving the answers that they need to their questions. And then we're a secure cloud. The reason all of this is secure and more performant is because we have a private network. It's very different from other hyperscalers; they use public infrastructure, which creates room for risk of cybersecurity attacks or ransomware attacks. Also, the performance of the network is very different when you have a private network versus data that is flowing through multiple infrastructures.
Unknown Speaker 2:54
Also, it's more sustainable, because we design the chips, we own the cables, we own, like I said, the data centers and the network, and all of this runs with 60% less energy.
Unknown Speaker 3:06
So because it's in the spirit of ProveIt, we're not here to just show slides. What I'm going to tell you is that we built an Innovation Hub, or a smart factory, in Chicago, which is where I'm based, and we can showcase to you all of these areas that we focus on, starting with the future factory. You'll see Dave later taking what
Unknown Speaker 3:28
Walker was mentioning earlier this morning and giving you access to the tools and the technology, showcasing to you how we do this. The reason we focus on the future factory is to help these manufacturing companies
Unknown Speaker 3:43
move on from all the POCs that they've been doing and deploy a solution at scale.
Unknown Speaker 3:50
But then we don't stop at the factory. We then correlate all the data to your supply chain, in a secure manner, and we focus on capturing all these manual steps that your workers are taking and all the systems that they touch, and we automate this by leveraging AI and agentic AI, so it allows your workforce to upskill and focus on what they are really good at, which is critical thinking, instead of doing manual tasks. So what we've done is we built a kiosk showcase within MXD. MXD, you might not know it, but it's part of the Manufacturing USA program, which is an institute focusing on manufacturing times digital. And all of the solutions that we have are live demos. So in the spirit of ProveIt, it's exactly what you want to see. Last year we had 10,000 visitors, and we've had 40 private customer visits. And here's a quick video that I'm going to play for you to understand what you can expect if you decide to come and
Unknown Speaker 5:00
do a follow-up from ProveIt.
Unknown Speaker 5:02
If the video can play.
Unknown Speaker 6:22
So in that video, what you can see is there are various kiosks, and within all these kiosks we use Siemens hardware, or Rockwell, or ABB, to capture data from the equipment, and they deliver insight into what's happening on the shop floor. We also have multiple Gen AI demos, or visual intelligence demos. We have, and we'll talk later about this, the Cortex framework, where we link what's happening in your supply chain to what's happening in the factory, to explain maybe a quality decrease; or we understand that there is a customer complaint on a product, and we trace it back from a CRM system to SAP to what's happening on the shop floor. And Fabian, I think one really interesting point about MXD is the level of partnerships that we've had across organizations, building on top of MDE, building on top of core Google Cloud capabilities for visual inspection. So just like Walker was talking about earlier, I think partners are very key to delivering an end-to-end solution. That's a great point. So, by the way, partners: that's usually where I start this presentation. I talk about you as a manufacturing company, and then I refer to a book called The 7 Perspectives of Effective Leaders by Daniel Harkavy. The reason is because of those seven perspectives, one of them is partners, and it is key to transforming, because you can understand your current situation and the vision that you have, and you can maybe understand and define those steps that you need to take. But when you invite a partner, they're going to work with your customers and your employees to help you challenge or reflect on those steps and then help you design a better approach. So what I was
Unknown Speaker 8:06
glad to hear this morning was when Walker really mentioned the importance of partners, and partners doesn't mean just Google Cloud. It could be like 20 partners. And the reason is, there are things that we are not good at ourselves. So for example, we'll show you how we help capture data from equipment; that is not something we focus on. We focus on how we can make sense of the data and how we can correlate those data points. So if you think about customer experience, those are the pillars we're going to focus on: helping you meet your customer expectations by helping you iterate and design new products and innovate. Meeting your customer expectations means that we can also share what your customers are looking for, because we understand trends and what people are looking for, so we can make a correlation outside of your enterprise data. And for example, like I said, if there's a strike or news or a flood somewhere, is it going to impact your supply chain? You want to make sure that all of this can be searched and correlated, and you can run what-if scenarios against your business process. Then, of course, we're going to focus on manufacturing operations, which is what Dave is going to go into detail about, and, like I said, the supply chain. So how do we do this? We do it by leveraging the exact same tools that, you may not know, are powering Google Search or YouTube or Maps. These are the same tools that you can then apply to your enterprise data. And then, when you think about Google: when I started 30 years ago, I used to sell a knowledge platform that was competing with Google, but it was easier because it was focusing on trusted data, trusted data that was published on paper by companies like Elsevier and Wiley and Springer, and, you know, you want trusted data when you're an engineer or scientist.
Unknown Speaker 10:00
So what we do here is we focus on your data, but we're not just looking at text, and we're not just looking at
Unknown Speaker 10:07
charts, maybe drawings. We're looking at audio and video. So you can use those multimodal capabilities that AI brings. It allows you to leverage best practices from somebody recording how they work at a workstation, and it makes a huge difference, because right now all of these systems that are listed at the bottom, your PLM system, your MES, they are all siloed. They are great systems of record. The problem with these is they are very rigid. So what do you want? You want a solution that is able to go retrieve the data, access it, and make it useful to your employees. So this is what we do, and we do this by leveraging a lot of solutions. AI is one of them.
Unknown Speaker 10:52
And we're going to go into detail about what Walker will show you on Thursday: how he's able to capture data from the equipment at the factory that he mentioned this morning. We're able to store and process it and then deliver it through application insights to the users, whether they're focused on energy optimization, like you heard this morning from the other solution provider, or on detecting an anomaly in the vibration sound of one of your machines. All of these insights can be captured out of the data, and we can make them shareable to a new employee, somebody who hasn't been there for 30 years and doesn't know how to recognize the sound of a machine vibration that is maybe going wrong. So we can deliver this insight, and you can see that what it's doing is eliminating the need to go searching for Dave, because Dave knows; he's been here for a long time, and he can explain to me what to do. We can deliver these on a smartwatch, so that the PLC talks to you directly and sends you not just an alert that there's going to be a problem, but the work instructions and all the steps. So you can come see this at MXD, where we have this live.
Unknown Speaker 12:04
The goal: you can see the previous slide was talking about the manufacturing data engine that Amy mentioned, and you can see the use cases on the right. But what I want to get to is that knowledge platform. Now you have access to a solution set that is pre-built, that's going to allow you to ask questions of your enterprise data and get an answer and feedback, and they will show you that in a few minutes. So as you can see, the data sources are on the left, whether it's your PLC or anything else; again, we're an open platform. It doesn't matter what hardware or software you're using. You can have data residing on-prem, on different hyperscalers, or on GCP; we're still going to be able to retrieve that data and make sense of it, and then we're going to bring it in and use those tools to deliver data and insight to the consumer. And one of the key components here is definitely, as Walker mentioned, all roads lead to BigQuery. That's going to unlock a lot of capabilities for you, to be able to make these correlations across these different enterprise data sets very easily and do that at scale. That's right. So again, you see the same chart, just a different schematic, where you see all the data sources at the bottom and the data flowing towards the user at the top, which we call innovation. In this case, we're focusing on precision manufacturing, but your marketing team, your sales team, or your procurement team might want to ask questions of your enterprise data. And the fact that we're connecting all these siloed data sources; as one of my customers always said, it's ubiquitous augmentation of what you already have. So you already have it. You're already doing it. Well, instead of 30 steps or three weeks, you might get the answer in three seconds and one step. Okay, so here's where we focus on helping our customers, at the innovation and product development level, at the R&D level. Again, you can be a chemical or oil and gas engineer.
We can look at the equations, transfer them into text, or go from text to equations, and help you design new molecules. It can be a new car iteration, leveraging simulation on HPC on GCP, which allows you to do a lot of iterations on design and then move into manufacturing and supply chain, always helping you reduce your carbon footprint, because, you know, this is one of the growth areas. So thank you again for the few minutes here. I'm going to pass it to Dave, and he's going to deep dive into the application. Perfect. Thank you, Fabian. If we can switch the slides over,
Unknown Speaker 14:53
can we switch the slides to the other screen, to the demo? Thank you. Thank you. So first, before we jump into the demo, and I
Unknown Speaker 15:00
think I've got some fun demos for you, hopefully to kind of pep you up after lunch here, but I wanted to just take a quick minute and focus in on what MDE is, right? We talked about it at a very high level, but effectively, it is a prescriptive approach to be able to capture and store and then analyze all of our machine tag data at scale, right? So this is not an out-of-the-box, black-box SaaS or PaaS solution; it's really stitching together a variety of different services within Google Cloud, which you've got full custom control over. So if there are different components that we built into it that you prefer not to use, right, it's open. You can eliminate those. You can add in different components, and so it's very modular. But the goal here is, at the end of the day, to be able to store and stream in large volumes of data at very low latencies. So think about when we're trying to capture hundreds of thousands of tags across an enterprise, across many different sites. This is what MDE is built to do, okay? And you can, of course, do this on your own. If you want to build it from scratch, that's perfectly fine. But I get that question all the time from customers: why wouldn't I just build this myself? Well, why would you, when the solutions team at Google has already built this end to end for you, and there's no additional cost associated with MDE besides the underlying consumption of the services that you're using? So it's really a go-fast button to be able to get better value in a shorter amount of time, eliminate that arduous process of building all this plumbing at scale, and focus on the higher-level analytics, getting value out of your data. Okay, so that's really the main concept there around MDE. And a good example: Dave is going to show you how he did this, leveraging the data from the virtual factory. And then on Thursday, I think Walker is going to show how he did it. But Dave did it in 10 hours.
Walker probably did it in a lot longer than this, because he did it himself. The nice thing is, you start small, and what Dave is allowing you to do is scale fast, in order to reach your vision. Yep, exactly. And going back to the partnering aspects, obviously, Google has not necessarily invested heavily in the edge side of things from a manufacturing perspective. So that's where we partnered with Litmus Automation to be able to deliver Manufacturing Connect Edge, which is what I'm showing on the screen right now. So this is basically our component in the MDE stack for the connect and collect side of things. Litmus is going to be doing a great session tomorrow; I definitely recommend going and checking them out as well. But at a real high level, let me take you through how we connected up via MCE to the MQTT broker. So here, within the device hub, we can come in and add any type of new device, and there's a wide variety of different out-of-the-box connectors available for us to work with. So when we want to go directly to the PLCs, CNCs, any of that type of equipment, we can do that here. We've got our connection into the MQTT broker for the ProveIt factory. And if I jump over to the flows manager here, we've just got a very simple Node-RED flow to grab all the tags and do very lightweight transformations, so that MDE understands what data we're sending in. So there are kind of three bare minimums that we have when we're trying to send data into MDE: that it's got a tag name, a timestamp, and a value, right? Then we can add in a wide variety of other metadata locally at the edge here. So if we've got variable metadata that we want to capture in the processes on our manufacturing floor, we can do that, and it's highly flexible, so every plant does not have to conform to a very strict, rigid schema. We can be flexible.
We can enable folks to capture different metadata that different plants may need for their own use cases, but we're still able to converge that. So we've got an enterprise view as it lands in Google Cloud.
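The "three bare minimums" Dave describes — tag name, timestamp, value, plus optional per-plant metadata — can be sketched as a small payload builder. The field names below are illustrative only, not MDE's actual schema:

```python
import json
import time

def make_tag_message(tag_name, value, metadata=None):
    """Build a minimal tag payload: tag name, timestamp, value,
    plus optional plant-specific metadata (field names are hypothetical)."""
    msg = {
        "tagName": tag_name,
        "timestamp": int(time.time() * 1000),  # epoch milliseconds
        "value": value,
    }
    if metadata:
        # flexible extras the edge can attach, e.g. operator or work order
        msg["metadata"] = metadata
    return json.dumps(msg)

payload = make_tag_message("press103/oee/quality", 0.97,
                           {"operator": "A. Smith", "workOrder": "WO-1234"})
decoded = json.loads(payload)
```

The point of keeping the required core this small is exactly what the talk describes: every plant can attach its own metadata without breaking a rigid shared schema.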
Unknown Speaker 19:06
So lastly, on the edge side here, if we go over to the integrations in MCE, we can send this data out to a wide variety of different destinations. And if I scroll down here, here's our Google Cloud Pub/Sub connector. So this is our entryway into MDE in the cloud, via Pub/Sub. If you're not familiar with Pub/Sub: if you've ever heard of or worked with Kafka or MQTT, it's basically a managed message broker. Okay?
Unknown Speaker 19:33
Now, while we do love MCE and love the Litmus team,
Unknown Speaker 19:39
again, MDE is modular. If you've got existing infrastructure, existing solutions for capturing the tag information, as long as that can flow into Pub/Sub; HiveMQ and a variety of other folks here have connectors into Pub/Sub and BigQuery, right? We can use that as well. So again, it's modular and flexible as far as how you want to build this, and you can
Unknown Speaker 20:00
take data from Ignition, for example.
Unknown Speaker 20:03
Yeah, absolutely.
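The Pub/Sub entry point described above decouples producers from consumers: one published message fans out to every subscription, which is what lets MDE, BigQuery sinks, or any other system read the same stream. A minimal in-memory stand-in to illustrate the pattern (the real code would use the google-cloud-pubsub client's `publisher.publish(topic_path, data=...)`; the class and subscription names here are invented):

```python
import json
from collections import defaultdict

class FakeTopic:
    """Toy stand-in for a Pub/Sub topic with multiple subscriptions."""
    def __init__(self):
        self.subscriptions = defaultdict(list)

    def subscribe(self, name, callback):
        self.subscriptions[name].append(callback)

    def publish(self, data: bytes):
        # every subscription receives its own copy of the message
        for callbacks in self.subscriptions.values():
            for cb in callbacks:
                cb(data)

received = []
topic = FakeTopic()
topic.subscribe("mde-ingest", lambda d: received.append(json.loads(d)))
topic.subscribe("bigquery-sink", lambda d: received.append(json.loads(d)))

msg = json.dumps({"tagName": "press103/state",
                  "timestamp": 0, "value": "RUNNING"}).encode()
topic.publish(msg)
```

Both subscribers get a copy without the publisher knowing either exists — the decoupling that makes the "modular, swap in HiveMQ or your own pipeline" point work.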
Unknown Speaker 20:05
So now, as we're leaving the edge side and moving up to the cloud with our data, here in the MDE console is where we can manage all of our tags. I don't have to come in here and manually create all these different tags; this is basically feeding off of the MQTT broker. So all the different partners that have been participating out here at the conference have been adding things in, putting things back on the UNS, and that was automatically getting created and generated here within MDE, to land that data in BigQuery as well. So just as a quick example, we're taking a look at press 103, some of the OEE metrics here. If I open up this particular tag, we can take a look at our storage settings. Again, this is really designed to be as simple as possible, to get deployed, get going quickly, and start making use of your data. And so within the MDE platform, we can land this data in a variety of different locations. Again, as Walker mentioned, everything leads to BigQuery. That's simply our landing spot for a wide variety of use cases, whether it's descriptive analytics as far as what's going on in the plant right now and what's happened in the past; but it can also be our repository for building higher-level analytics. When we want to get into predictive maintenance and look at past history and failure events, BigQuery can be the source of data for that. We've also got options like our Bigtable platform. Think about scenarios where you've got really, really low-latency, very high-frequency reads of data; think of something like a vibration sensor. If we've got a lot of those streaming data in, Bigtable can be a good fit for that. BigQuery can handle a lot of streaming data, but Bigtable can handle a really significant amount more.
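The OEE metrics mentioned for press 103 are conventionally computed as availability × performance × quality. A quick sketch with made-up numbers — MDE itself just stores the raw tags; this is only the standard textbook formula:

```python
def oee(run_time_min, planned_time_min, ideal_cycle_s, total_count, good_count):
    """Classic OEE = availability x performance x quality."""
    availability = run_time_min / planned_time_min          # uptime share
    performance = (ideal_cycle_s * total_count) / (run_time_min * 60)
    quality = good_count / total_count                      # first-pass yield
    return availability * performance * quality

# hypothetical shift: 400 of 480 planned minutes running,
# 30 s ideal cycle, 760 parts made, 722 good
score = oee(run_time_min=400, planned_time_min=480,
            ideal_cycle_s=30, total_count=760, good_count=722)
```

With these numbers, availability is about 0.83, performance and quality are each 0.95, so the OEE comes out around 0.75.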
Unknown Speaker 21:45
And then lastly, as we look at some of the other options down here, as far as sending data out to something like Pub/Sub: when you think about a UNS being able to feed a wide variety of different systems, right, when we talk about all our plants feeding into MDE, it could now basically act as our enterprise UNS to feed out to other systems. So all of our tag information flowing in, other supporting data and metadata flowing in, and we can put that back on a Pub/Sub topic and subscription for other systems to consume from. So it's highly flexible, again, to work with this data, and not all of the data has to flow to the cloud, either. A lot of things can stay at the factory level. You can also use one of our Google Distributed Cloud appliances, where we ship Google Cloud to your premises, so that you can run a specific use case that needs less than 60 milliseconds of reaction time for you to take an action. So we can do hybrid environments, yeah. And one of the great things there is the integration with our AI platforms: as you build some of those higher-level analytics in the cloud, if you've got a solution that needs low latency in terms of decision making after we make a prediction, right, that's where we can take that model and deploy it at the edge on MCE, so that we've got that low-latency communication as we're flowing data in, making a prediction, and then taking action, either automated or manual, right on the plant floor.
Unknown Speaker 23:15
So lastly, here within MDE, one of the things that I just wanted to show is our ability to enrich this tag stream with further metadata. As I mentioned, in Node-RED at the MCE edge, right, we could obviously add in a variety of metadata about the operator and the operation that's happening on the machine at the time, but we can also easily add in more of our slowly-changing-dimension-type metadata here in the cloud. This we can do highly efficiently: rather than putting this information into every single payload of the messages at the edge and then flowing that up, we can just enrich the data here in the cloud, because these things are rarely, if ever, changing in terms of our enterprise, our site, our location. So it's another great way to add further context to our information, to be able to utilize that for analytics.
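The cloud-side enrichment described here — joining rarely-changing site dimensions onto the tag stream instead of repeating them in every edge payload — might look roughly like this lookup-and-merge. The dimension keys and values are hypothetical:

```python
# Slowly-changing dimensions: kept server-side, out of the hot edge path,
# because enterprise/site/line assignments rarely change.
SITE_DIMENSIONS = {
    "press103": {"enterprise": "Acme", "site": "Chicago", "line": "Line 1"},
}

def enrich(message: dict) -> dict:
    """Merge site metadata into a tag message, keyed by the machine
    prefix of the tag name (naming convention assumed for illustration)."""
    machine = message["tagName"].split("/")[0]
    return {**message, **SITE_DIMENSIONS.get(machine, {})}

enriched = enrich({"tagName": "press103/oee/quality",
                   "timestamp": 0, "value": 0.97})
```

The payload on the wire stays three fields wide; the context gets attached once, centrally, before analytics.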
Unknown Speaker 24:01
So now, as we jump over to the BigQuery pane here, all the data from the ProveIt factory is flowing in, and I can rerun this query. If I know SQL, great; this is probably going to be a fairly familiar experience, running a query like this. If I don't know SQL or Python, but I still want to do analysis on my data, this is really where large language models, Gemini in particular from Google, come into play to help our users do this analysis. So here at the bottom, it's a little hard to see, but I'm just asking: what's the average quality for each machine per day for the last week? And so when I execute this, it's going out and finding the table or view that I need to be able to access that information.
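The natural-language question here ("average quality per machine per day") boils down to a GROUP BY aggregation. A local sketch of the equivalent logic, with the rough shape of SQL a model might generate shown in a comment — table and column names are assumptions, not the demo's actual schema:

```python
# Roughly the SQL shape (hypothetical table/column names):
#   SELECT machine, DATE(ts) AS day, AVG(quality) AS avg_quality
#   FROM `project.mde.oee_metrics`
#   WHERE ts >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
#   GROUP BY machine, day
from collections import defaultdict

rows = [  # sample rows standing in for the streamed tag data
    {"machine": "press103", "day": "2024-06-01", "quality": 0.95},
    {"machine": "press103", "day": "2024-06-01", "quality": 0.97},
    {"machine": "cnc201",  "day": "2024-06-01", "quality": 0.90},
]

totals = defaultdict(lambda: [0.0, 0])  # (machine, day) -> [sum, count]
for r in rows:
    key = (r["machine"], r["day"])
    totals[key][0] += r["quality"]
    totals[key][1] += 1

avg_quality = {k: s / n for k, (s, n) in totals.items()}
```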
Unknown Speaker 24:51
It is then formatting and let me scroll back up, because it's kind of building out quickly on me here.
Unknown Speaker 24:57
It's building out the natural language to SQL.
Unknown Speaker 25:00
Cool, so I didn't have to know how to write the SQL. I just asked the question, right? It built out the SQL for me, ran the query so I get my result sets, and then it also generated a visualization for me, so I could immediately see how these things are trending over time based on the question that I asked. All right, Dave, real quick. I just want to interject real quick, yeah. OK, so this is totally unplanned. I was supposed to show you this morning how you would use Google Cloud services to start. The way the vast majority of people are going to use GCS is they're either going to write a Python script and stream directly into BigQuery after they set up the security permissions, or they're going to use Node-RED to do it, and then these guys were supposed to come in behind and show how you would actually do it at scale. But there is no way to overstate how big of a deal what he just showed is; there should have been oohs and ahs when he did this. There's no way to overstate how important a feature what he just showed is: not just using natural language to build queries, but what he's going to do later. So what you're seeing is the progression of data being collected, analyzed, processed, and visualized, all in a common flow. And he's going to show here in a couple of seconds, because I had a chance to see what he's going to do, how you can use generative AI to interact with your data: not just to build queries, but how you can use natural language to interact with the data. And at scale, that's incredibly important. Like, if my purview is a division, and my division makes up 40 plants, and each plant has an average of 1,000 PLCs, and those 1,000 PLCs feed 100 assets, how am I supposed to know at any given time where I should focus my efforts, or where my team should focus their efforts? I have two options.
Option number one is to deterministically define the KPIs that are going to funnel me down to the assets or the groups or the areas that are most important. And option number two is to use artificial intelligence to come up with the KPIs in real time based on the questions I ask, and then whittle that data down to what I should be focused on. So what's on the screen here is a big deal, and it's the reason why, at the end of the day, what I show you Thursday morning is the way you're going to start, and then you're going to realize that isn't what Cloud is. Cloud is what they're showing you on the stage right now; what I'm going to show you on Thursday is how everyone starts. Because at the end of the day, all roads, whether you're a first grader, a seventh grader, a 12th grader, or a postdoctoral student, all roads lead to BigQuery. The question is, how do you get the data there? And there was another thing you quickly passed over that was just really, really important. The ability to add context to data as it is streaming through Pub/Sub towards its destination is critically important, and Gregor did an excellent job at Hannover Messe highlighting the importance of being able to add context to a stream. So take a series of values that are coalesced together, and imagine appending something like a .SAPID or .licenseTag onto the stream itself. Now what I've done is I've added context without joining the data together. The more digitally mature you become, the more complex the solutions you are building, and the more important what they are showing you becomes. Many of you will look back; if you're the average, the mean, at the seventh-grade level in digital transformation, then this looks kind of impressive, but you're not sure how you're going to use it. But if you've been doing this for five years, or for eight years, or you've gotten to the point where you really want to turn the wrench on Cloud,
you're like, where the fuck has this been? Okay, that's what you're saying to yourself. Okay, so it is a very, very big deal. This component here, this is the money shot. It is, it is. And the fact that you can land this into the project scope within Google, and what I talked about earlier, the project scope in Google Cloud services, is so critical, because once the data has landed inside the project, it's accessible by all those tools that they put on the screen without you having to pipe it. There's no pipe involved; it has access to the data. That is absolutely critical to shorten time to value. So anyway, there was no intention for me to get up here and do this, but because I moved my presentation to Thursday, I wanted to make sure I added that context. And thanks, Walker, for jumping in, because I was going to give him a hard time, but then I heard he priced 500 from like, Okay,
Unknown Speaker 29:53
thank you. Maybe not. Excellent job. Go ahead, Dave, thank you. I do have a question for you when you're done, though. Yeah.
Unknown Speaker 30:00
Yeah, awesome. How we going inside? OK, cool. So,
Unknown Speaker 30:06
kind of transitioning from a data analyst working in BigQuery, who may want to deep dive here and do additional joins of other contextual data: when we think about other users in the organization, more of our business users who will just want to see the high-level dashboards, right, that's where Looker Studio comes into play, to have that overlay on top of BigQuery for that more simplistic view, while still being able to very easily interact with the information. Another example, though, as far as bringing the Gen AI experience to those users, is this ability to have a data agent and conversational analytics right here within Looker Studio. So I built a very simple data agent here. I may need to refresh my screen. And that's right. So as you're doing this, it's all these applications that I was mentioning for the data to be consumed by your business users. The first one was Looker, showcasing a lot of the dashboards that you need, whether it's in your factory or on your cell phone or smartwatch. I mean, there are a lot of ways to get that delivered to you. And another way to interact is this agentic AI that we're deploying at our customers. And often the reaction from the customer is exactly what Walker said: like, where was this for the last seven years? Absolutely. And so in this scenario, without having to pre-build a dashboard for every single user's preference, every single variation of a visualization or KPI, right, a user can just ask for what they're interested in. And so in this case, maybe a manager just wants to understand: what was my availability per machine yesterday? And so we'll go ahead and fire off that query. And again, we're doing that natural-language-to-SQL process, because all this data is still sitting in BigQuery. It's streaming in as we speak.
And so we're going to build that query, and then it's also going to visualize that information automatically for us. And as I scroll back up here, you'll see, if I have questions about how this was calculated, right, it's not just a black box. It's going to tell you exactly: here's how I calculated this for you, based on your natural language question. If you want to see the code, if you're a more advanced user, right, there you go; there's the code that it used to generate that. And then we've got our nice visualization at the bottom, as well as natural language insights about what's happening in this visualization, so we can get further insights automatically. Yes, again, the trusted source, the grounding, you know: yes, you're getting insight information, but where is it coming from? You want to validate this, like I mentioned earlier. Yeah. And now you know why I am so excited about Google Cloud, OK? The reason I advocate so heavily for it is because I can tell you, I've tried to do this in other environments, and there's a way to get there, but there's no way to get there in the amount of time that Dave invested in building this by piping our data: 10 hours. And literally, a man-day
Unknown Speaker 33:09
of investment into this. I mean, I remember I asked him, how long did it take you, Dave? He said, I squeezed it in between two clients. So, to put it in perspective, okay. And I just want to ask one quick question, Dave. So on Thursday morning, I'm going to show how most people are getting started with Google Cloud, right? That isn't obviously the optimal path, but I'm going to show it so that I can relate to the audience: hey, this is how, the first time I played with GCS, I did this. I went Node-RED, landed it into BigQuery, and then I built a Looker Studio dashboard. I didn't use MCE, which is Litmus Edge, and I didn't use MDE, which is, you know, relatively new over the last couple of years. I didn't leverage the Google Cortex framework. I didn't do any of that.
Unknown Speaker 33:53
So obviously, you work in solutions for Google, you're working with clients, and you see that people are doing it the way I'm going to show Thursday morning. What does their journey look like when they go from that to this? How are you onboarding them and moving them to a more optimal deployment strategy like this? Yeah, I think it always comes down to that technical skill level as a starting point, because, as you mentioned, we have to start somewhere, right? We can't just jump head first into the uber-scale solution. So I think it's upskilling on knowledge about why MDE was built the way it was, and then how do we transition as that scale becomes necessary. Because we can do a lot of these same things the same way you're landing data in BigQuery, right? We can expose this and leverage these generative AI capabilities right on top of it. So it's identifying when that scale is needed, and then upskilling their teams internally. Yeah, that's a good question. So one of our customers, Renault in France, the automaker.
Unknown Speaker 35:00
You know, they did this way before MDE existed. They started in 2019, and four years later, I mean, they save a billion dollars every year. But it took them time, because they took the tools that existed and they did it themselves, like you did, at a big scale, 39 plants. Well, now an automaker can deploy that within weeks. So that's the big difference now that it's available. Yeah, absolutely awesome. So we'll keep moving here. I just have a few other quick demos I wanted to share with everybody. So as we start thinking about MDE as that platform for building descriptive analytics, like we've shown thus far, how can we now start transitioning to more predictive analytics? Oftentimes the challenge there is, do we have good quality data to train a model on? You may not necessarily have a lot of good quality data, where you've got, say, labels of when this type of failure on the machine happened, or what the leading telemetry was before a failure occurred. So one of the ways we advocate to get started quickly is to leverage what's called the Timeseries Insights API within Google Cloud. We can send data from MDE out to that API and have it learn what the pattern is for that particular tag stream. So I'll go ahead and fire off this example here of creating an anomaly in the vibrations within the drive shaft of a shot blaster piece of equipment, and I'll come over to this report and we'll talk through what's happening here. So MDE is feeding in the phase current and RPMs as raw tag information here, and those are going out to the Timeseries Insights API so it learns the pattern. It's not just setting a high water mark and a low water mark for the operational bounds of this metric; it's learning from past history what this tag typically looks like.
And then, once it's built up enough information, usually a couple of days' worth of normal pattern, we can start calling out anomalies, right? And that's what's happening here: in this visual, we've got anomalies being predicted by that Timeseries Insights API, and those are also being fed back into MDE. So now we've got a persistent record of our anomalies, and we can go back to repair information or service tickets for the machinery to understand whether there was an actual failure, and what error codes were thrown when these anomalies were leading up to it. We build up a repository of information we can train a more advanced predictive model on, because now we've got good quality data of leading telemetry paired with some type of failure event. So
Unknown Speaker 37:47
this is just a great way to get started.
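The key idea here, learning a tag's typical pattern rather than fixing static high and low watermarks, can be illustrated with a rolling z-score detector. This is only a toy local stand-in for what the Timeseries Insights API does, with made-up RPM values:

```python
from collections import deque
from statistics import mean, stdev

def make_detector(window=50, threshold=3.0):
    """Return a closure that flags a reading as anomalous when it sits
    more than `threshold` standard deviations from the recent mean --
    a toy stand-in for a learned per-tag pattern."""
    history = deque(maxlen=window)

    def observe(value):
        anomalous = False
        if len(history) >= 10:  # need some history before judging
            mu, sigma = mean(history), stdev(history)
            if sigma > 0 and abs(value - mu) > threshold * sigma:
                anomalous = True
        history.append(value)
        return anomalous

    return observe

detect = make_detector()
normal = [1500 + (i % 5) for i in range(40)]  # steady drive-shaft RPM
flags = [detect(v) for v in normal]           # no anomalies on normal data
spike = detect(4000)                          # sudden vibration spike flagged
print(any(flags), spike)
```

Note the difference from watermarks: the bounds tighten or widen with the tag's own recent history instead of being set once per metric.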
Unknown Speaker 37:52
Now, lastly, we wanted to showcase some of the things we're doing outside of MDE within the Google Cloud Platform. Talking to Walker about this and seeing if it would resonate with the group, he mentioned mean time to repair is a huge factor for many, many organizations. So one of the things we've done at MXD is take a lot of the technical documentation for one of the robots that we've got there, an ABB robot arm. We've taken all that technical documentation, 800-plus pages of very detailed documents, and put it into an Agent Builder search agent. So what I can do now is just ask a natural language question about the ABB robot. In this case, I'm wondering, how can I replace the turning disk on this robot arm? So Agent Builder here, this search agent, is allowing me to not only do keyword search to get the exact match from those documents; I'm also getting a nice summary here and a breakdown of generated steps. So it's first going to help me understand whether there are any safety danger warnings to be aware of. It's also linking directly to the document that it's grounding this information in. And then it's going to take me through nice, easy-to-follow steps so that an operator can feel confident they're following the correct steps, they're remaining safe, and they're going to be able to repair that piece of equipment much more quickly than if they had to comb through an 800-page document to find the answers. And one of the things that I really love about this agent capability is that a lot of plants are going to be worldwide, so we've also got natural language translation built right in. I can ask the exact same question now in Spanish. We're not necessarily translating the entire document itself, but the generative responses we get back respond in kind based on the input language.
So if I come over here to our settings, you can actually see all the different languages that we support, and this is continually growing. But for an international
Unknown Speaker 40:00
organization, right? This could be incredibly valuable: build once, and take it to multiple plants, no matter where they are around the world. Yes. So for example, the CIO of GE Appliances came to MXD; they have a huge plant in Kentucky where they have 33 native languages on the shop floor. So leveraging this type of solution helps them. But you can also capture audio and video of somebody working at a station, or taking apart a dishwasher, and then you can query the video and get the YouTube-short version of it, you know, telling you exactly what to do at what time. So you can train your people faster, or, in the first place, capture the tribal knowledge of the workers, people who have been there for 40 years and know how to use this equipment, before they retire. You can capture this and make it available in any language by querying video or audio. Yeah, and that's a fantastic point that I always forget to hit on: in this demo, we just used the technical documentation from the manufacturer, but you've got so much tribal knowledge built up that we could also add into the context here, so that we've got more nuanced recommendations specific to how your plant operates and how those experienced operators execute tasks. And it's not just receiving an alert that something's going wrong; it's telling you already what to do next to fix the issue or avoid an issue. That's why we have an industrial smart watch at MXD in Chicago, where you receive the work instruction that Dave was showing you instead of having to look for it. Yeah. And if you're looking for a quick win for your plants, I really don't think there's anything quicker than this. I can step you through real quick how I built this. So in Agent Builder here within Google Cloud, we've got a variety of different options in terms of the different agents we can build.
And this one, I'm just creating a document search agent. I'm going to give it a quick name. We'll call it Demo. Give it a company name. We'll call it Pruitt.
Unknown Speaker 42:06
Click Continue, and then we simply create a data store. So here we've got connectors into Cloud Storage, where I can define a folder full of files with my information. So we've got this robot arm folder with some of that documentation. Continue, and we'll give it a
Unknown Speaker 42:27
data store name, and then we simply define which type of parser we want to use. So if you've got older PDFs, even scanned in, the OCR parser works well. The layout parser is really good when we've got a lot of tables and structures in our documents, and I've used that previously. And just like that, I can click Create. Absolutely no code, right? That gives me the exact same agent that I just showed you. And so I can come back here, and we've got very simple natural language instructions that we defined for our agent.
Unknown Speaker 43:05
So if I come down here, it's a little hard to read, but it's basically just saying: you're a helpful agent designed for industrial manufacturing operations users; when they ask you questions, ground your answers in the repository of documents we've made available to you. And just like that, you can have that and expose it to your users, and you can also integrate it into applications. So if I come over to my integrations pane, we've got example code here for taking this right into your own application by making API calls from other applications. So it's incredibly easy to get started, with very, very fast time to value for that particular solution.
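The integration code that pane generates amounts to an authenticated REST call against the agent's search endpoint. A sketch of what such a request looks like; the project and data-store IDs are placeholders, and the endpoint shape is an assumption based on the Discovery Engine (Vertex AI Search) API rather than copied from the demo:

```python
import json

# Placeholders; real values come from your Google Cloud project and the
# data store created in Agent Builder.
PROJECT_ID = "my-project"         # hypothetical
DATA_STORE_ID = "robot-arm-docs"  # hypothetical

endpoint = (
    "https://discoveryengine.googleapis.com/v1/"
    f"projects/{PROJECT_ID}/locations/global/collections/default_collection/"
    f"dataStores/{DATA_STORE_ID}/servingConfigs/default_search:search"
)
request_body = {
    "query": "How can I replace the turning disk on this robot arm?",
    "pageSize": 5,
    # Ask for a generated summary grounded in the matched documents
    "contentSearchSpec": {"summarySpec": {"summaryResultCount": 3}},
}
print(endpoint)
print(json.dumps(request_body, indent=2))
```

An authenticated POST of `request_body` to `endpoint` (for example with a bearer token from `gcloud auth print-access-token`) returns the matched document snippets plus the grounded summary shown in the demo UI.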
Unknown Speaker 43:49
Awesome. You want to do, like, one more video? So we'll speak to it, and you can speed it up. Basically, we'll talk to you about the Cortex Framework. So think about Cortex: Google is a big user of SAP and Salesforce and other PLM and MES systems.
Unknown Speaker 44:07
What I said earlier was, they're great systems of record. They're not being replaced, but they need to gain agility and speed, because systems of record are rigid. So what we're showcasing in this video, and what you can see at MXD, is a full digital thread between all of these systems. When a customer is complaining about a product, a case in Salesforce is now going to give you access to the bill of process and the bill of material: when it was produced, by whom, what equipment it was on, which part was involved in manufacturing this bike. And then you're going to be able to go all the way down into Manufacturing Data Engine, which is going to show that somebody was using the wrong tool, similar to what Walker was explaining with Hector this morning. Same thing, you know; it could be a piece of equipment that is failing because of vibration.
Unknown Speaker 45:00
But then, at that point, you can recall all the other products that were affected, exactly the units that were affected. You don't have to recall 100,000 vehicles, you know; you can just focus on the 30 that were affected by this. So it's a full digital thread between systems that you already have and that already exist. And Amy is going to come back? Yes. This is amazing. I hate to cut you off, but I want to make sure the audience can ask you guys some questions. So if we could just throw up the QR code again real quick, in case anybody has any
Unknown Speaker 45:35
questions they want to submit. And remember, via the QR code, all you do is scan it, and then you can vote and decide which one is your favorite.
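The targeted-recall step in that digital-thread demo is essentially a filter over production records once the offending tool and time window are known. A toy sketch with invented serials and field names:

```python
# Toy production log; serials, tool IDs, and field names are made up to
# illustrate the idea of recalling only affected units, not everything.
production_log = [
    {"serial": "BIKE-0001", "tool_id": "T-120", "station": "S3"},
    {"serial": "BIKE-0002", "tool_id": "T-999", "station": "S3"},  # wrong tool
    {"serial": "BIKE-0003", "tool_id": "T-120", "station": "S3"},
    {"serial": "BIKE-0004", "tool_id": "T-999", "station": "S3"},  # wrong tool
]

def units_to_recall(log, bad_tool):
    """Return only the serial numbers built with the offending tool."""
    return [rec["serial"] for rec in log if rec["tool_id"] == bad_tool]

affected = units_to_recall(production_log, "T-999")
print(affected)  # ['BIKE-0002', 'BIKE-0004']
```

In practice this would be a query joining case, bill-of-process, and MDE equipment data, but the payoff is the same: two serials recalled instead of the whole production run.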
Unknown Speaker 45:49
Yeah, me neither
Unknown Speaker 45:52
expected as the first one, but
Unknown Speaker 45:55
yes,
Unknown Speaker 45:57
I'll take the first pass, and if Dave has a better answer, let me know. But let me read the question out loud. The question is: Google is known for dropping services and support, or canceling products, maybe. Can you touch on Google committing to solutions for manufacturing? So yes. First, I have to recognize that it happens. The reason it happens is because we iterate, come up with products, and come up with better solutions. So, for example, it might be referring to something called IoT Core two years ago. Okay. Like Dave mentioned, there are things that we're really good at and things that we're not, okay? So we're going to leverage partners. In the case of Walker mentioning the importance of your partner ecosystem, this matters, because some of the things we do better than anybody else, but for others, we're going to have to rely on partners. So Dave mentioned Litmus, or HiveMQ this morning in his presentation, and that's where we have to recognize that maybe we don't do that better than somebody else. Now, we are committed, obviously, to manufacturing. You can see the public companies that are already leveraging us and speaking about what we are doing. Walker identified the advantages of using us in a manufacturing setting. So we're here to serve you, and it just goes back to being a servant leader: Google is here to help you understand what you can do and apply the technology to solve some of your specific challenges. And one other thing I'd just add to that is that we do have to draw a delineation between Google and Google Cloud. Yes, right? Thank you. Google is very much focused on the consumer side of the world, and that group basically looks at products in the billion-user range. So, yes, I'm sure you've all seen "Killed by Google," right? That is primarily in the consumer space.
Now, are deprecations going to happen in the enterprise space with Google Cloud? Of course. But we do have guarantees now in place, particularly after the IoT Core process, to say we will have at least an 18-month time frame before any type of deprecation occurs. I can't recall, I haven't seen any since that one, so hopefully there won't be any. That's right,
Unknown Speaker 48:27
but there's definitely a delineation there between enterprise and consumer. Yeah, but it was a fair question, and thanks for asking. Awesome. All right, we probably have time for maybe one or two more. So next question: the Google Cloud Pub/Sub connector. What value does this provide over and above native MQTT, Kafka, or Ignition tags? Does it run an agent behind my firewall? So Pub/Sub is not going to run locally at the edge, right? Pub/Sub is a message queuing service that resides in Google Cloud. However you want to get data to that endpoint, that's your entryway into MDE. And Pub/Sub, Google designed it and built it internally to move petabytes of data around Google data centers daily. So it's serverless, it's incredibly scalable, and that is what we've built on, giving you that scale without you having to manage the infrastructure of your MQTT brokers, scaling those up or out in a variety of different locations. So however you want to get the data to it, that is our scalable entryway to queue up the messages.
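At the wire level, publishing telemetry into that entry point is one JSON body per Pub/Sub `projects.topics.publish` call, with each message's payload base64-encoded. A minimal sketch that only builds the body (tag names and payload fields are made up), without sending it:

```python
import base64
import json

def to_pubsub_publish_body(readings):
    """Build the JSON body for Pub/Sub's projects.topics.publish REST
    method from (tag, timestamp, value) telemetry readings; MDE would
    then consume these messages from the topic."""
    messages = []
    for tag, ts, value in readings:
        payload = json.dumps({"ts": ts, "value": value}).encode()
        messages.append({
            # Pub/Sub message data must be base64-encoded in REST calls
            "data": base64.b64encode(payload).decode("ascii"),
            "attributes": {"tag": tag},
        })
    return {"messages": messages}

body = to_pubsub_publish_body([
    ("line1/motor/rpm", "2024-05-01T12:00:00Z", 1480),
])
print(json.dumps(body, indent=2))
```

Whether the data originates from an MQTT broker, Kafka, or Ignition tags, some bridge on your side performs this publish; Pub/Sub itself handles the queuing and scale in the cloud.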
Unknown Speaker 49:38
Okay, I have to ask this question, because I'm the MC: how much does this cost? Yeah, so when we think about all of Google Cloud's services, it's all consumption-based, right? So there's some variability in there. MDE, though, from a baseline infrastructure standpoint, is generally going to be about $2,500 a month. So,
Unknown Speaker 50:00
from a consumption perspective, right? But you do have to take into account what higher-level analytics you're going to build on top of that. As you start training models, yes, there's going to be more consumption. So think about that as just the base platform. Now, when we think about something like Agent Builder, which I was showing at the tail end of the demo, there again it's consumption-based: it's based on how many queries you're issuing to that agent that you built. But you're talking about pennies to dollars per 1,000 queries; you're not breaking the bank with Agent Builder. Wonderful. So I'd like to take the next two questions, because they're very important. Does Google have a gov cloud equivalent for ITAR and other sensitive data? Yes. So we have Google Public Sector, which manages all of the
Unknown Speaker 50:50
government organizations and aerospace and defense
Unknown Speaker 50:54
customers, and we can do multiple things, yes. So we have a gov cloud, but we also have, like, Google Cloud on-prem, where we can deliver it on a ship or in the middle of the desert, and we have a team that just takes care of this. The next question is, can you talk more about data privacy when moving manufacturing data into the Google Cloud ecosystem? So yes, it's your data, so it's in your own tenant, and maybe you can add to this, but we don't have access to it. We don't see any of it. You own the encryption keys if you decide to. The only thing you might be interested in is how you can correlate
Unknown Speaker 51:37
public data, like news, weather, traffic, and consumer data, with your data to give more insight to your employees. That's it, but that's totally different. Yeah, and going back to the security question here: everything that lands in your projects within Google Cloud is within your tenancy. No one within Google has access to that; that's your data, no one has visibility over it, hard stop. If you do need higher-level security controls, when you're thinking about FedRAMP, IL4, IL5, those can actually be enabled in any Google Cloud region through a service called Assured Workloads. It's basically a software layer on top to enable that; we don't necessarily have a siloed-off, limited region that's just for specific government entities. And if you need to get to something like IL6, Top Secret, that's where our air-gapped solutions come into play: we deliver basically Google Cloud in an appliance to those types of customers. And there's one more thing: we also offer sovereign cloud, so France, Germany, if they want data that doesn't leave the country, we deliver this with our own data centers, and it's available in multiple countries already. Awesome. I hate to cut this conversation short. I know we have a lot more questions, so please go find the Google Cloud booth to keep this conversation going, because we are at time. But this was incredible, really. Thank you guys so much.