
11 lessons learned from doing deployments // Sol Rashidi // DE4AI

Posted Sep 17, 2024 | Views 1.2K
Sol Rashidi
CEO and Founder @ ExecutiveAI

With eight patents granted and 21 filed, Sol has received awards that include: "Top 100 AI People" (2023); "The Top 75 Innovators of 2023"; "Top 65 Most Influential Women in 2023"; "Forbes AI Maverick of the 21st Century" (2022); "Top 10 Global Women in AI & Data" (2023); "Top AI 100 Award" (2023); "50 Most Powerful Women in Tech" (2022); "Global 100 Power List" (2021, 2022, 2023); "Top 20 CDOs Globally" (2022); "Chief Analytics Officer of the Year" (2022); "Isomer Innovators of the Year" (2021, 2022, 2023); "Top 100 Innovators in Data & Analytics" (2020, 2021, 2022, 2023); and "Top 100 Women in Business" (2022).

Sol is an energetic business executive and a goal-oriented technologist, skilled at coupling her technical acumen with storytelling abilities to articulate business value to both startups and Fortune 100s that are leaning into data, AI, and technology as a competitive advantage while wanting to preserve the legacy upon which they were founded. Sol has served as a C-suite member across several Fortune 100 and Fortune 500 companies, including:

Chief Analytics Officer - Estee Lauder
Chief Data & Analytics Officer - Merck Pharmaceuticals
EVP, Chief Data Officer - Sony Music
Chief Data & AI Officer - Royal Caribbean Cruise Lines
Sr. Partner leading the Digital & Innovation Practice - Ernst & Young
Partner leading Watson Go-To-Market & Commercialization - IBM

Sol now serves as the CEO of ExecutiveAI LLC, a company dedicated to democratizing artificial intelligence for humanity. She is considered an outstanding and influential business leader in the space, traveling the world as a keynote speaker and serving as the bridge between established Gen 1.0 markets and those evolving into 4.0.

SUMMARY

With more than 200 POCs built and nearly 40 products in production, Sol walks us through the journey of developing AI products at scale and the 11 lessons learned along the way. Spoiler alert: only 30% of the challenges are tech-related; 70% are non-tech issues!

TRANSCRIPT

Demetrios [00:00:03]: I am going to bring on our first keynote of the day. Let's see if we can get Sol up here. Hey, there she is. What's going on, Sol?

Sol Rashidi [00:00:13]: Good morning, good afternoon, good evening. All of the above. It's 6:00 a.m. in California, so I think we're hitting all time zones.

Demetrios [00:00:22]: Exactly. Well, I know that you squeezed this in, and I'm so thankful for you to do this because you're in between transit and you're making the time for us, which is huge.

Sol Rashidi [00:00:35]: I love this community. This is how I grew up. So my fellow brothers and sisters always, always, always make fun.

Demetrios [00:00:41]: There we go. Awesome. Well, I know you have a screen that I'm going to share right now, so if you want to navigate to your slides, then I will throw it up on the screen. And in the meantime, anyone that is out there, feel free to grab one of these good old "hallucinate more than ChatGPT" t-shirts. We're having fun.

Sol Rashidi [00:01:06]: Yeah. And I have one, and I absolutely love that aquamarine color that you have, the light one. And I always get comments on it, so it's definitely a fun shirt. And the fabric is great. That makes a difference.

Demetrios [00:01:17]: There we go. There we go. So I did not pay you to say that. Just so everybody knows. Right on. Well, here we go. I'm throwing your screen up onto the stage and should be able to. Oh, there we are again.

Sol Rashidi [00:01:36]: All right. Can you. Can you see the eleven learnings?

Demetrios [00:01:39]: No. What we see is us and the, like, the infinite.

Sol Rashidi [00:01:46]: All right, let's try this again.

Demetrios [00:01:48]: Yeah, yeah. Here, if you want to. This is the fun part.

Sol Rashidi [00:01:53]: All right, how about this?

Demetrios [00:01:55]: Now we see it. Awesome. You should be able to see it, too.

Sol Rashidi [00:01:57]: Okay.

Demetrios [00:01:58]: Let me give you a little more space by getting this QR code off of the stage, and I'm going to hand it over to you. I'll be back in like 25 minutes.

Sol Rashidi [00:02:08]: Yeah, perfect. And if there are any questions or messages that come along the way, just let me know so that at the end we can get some Q&A going, too. I don't want it to be a monologue. I do definitely prefer the dialogue. Right on. So if you can help me out with some of those comments and questions, I would love and appreciate that.

Demetrios [00:02:24]: Yeah, yeah. I'll jump in when people ask them in the chat and I'll propose them to you. All right. See you soon.

Sol Rashidi [00:02:30]: Perfect. All right. Well, good morning, good afternoon, good evening, everyone. I really, really appreciate everyone coming to this webinar keynote. Demetrios and team are just awesome, and every time I come to an event or I'm speaking at an event, it just blows my mind how much coordination and organization they're able to put into it. And there are just a lot of smart people speaking today, whether they're keynotes or panels or whatever it may be.

Sol Rashidi [00:02:55]: So hopefully, you know, you guys can carve out some time and you take a few nuggets home just to kind of open us up, because it's 06:00 a.m. so why not? Where I am at, I have this fun little joke of what do AI and teenage sex have in common? I know it's a bit provocative, it's a bit progressive, but bear with me. I mean, at the end of the day, everyone talks about it, nobody knows how to do it, everyone thinks everyone else is doing it, so everyone claims they're doing it. But in reality, in the world that I've come from, and I'll give you a little bit of background and context over the course of time. The fact of the matter is that while there's so much information out there and there's so many use cases, there's a lot of great and fun enterprises who are out there saying that we're doing this. We're doing this. Only about 40% of the ecosystem has actually even started doing something. And I'm not even talking about pocs, I'm not even talking about pushing things into production.

Sol Rashidi [00:03:52]: This 40% includes the discussions at the board, kicking the tires, even thinking about use cases, or bringing someone in, like a management consulting firm, to kind of help them walk through what the art of the possible is. So even though it's a reality for us, and some of us have been living it hardcore for the past two years, and some of us have been in it for over a decade and a half, the fact is, it's still relatively new to a lot of industries and a lot of functions. Now, just by way of introduction, some of you guys may know me, but most of you guys probably don't. I kind of came out of the woodwork. I grew up playing rugby on the women's national team. I thought I was going to be a professional athlete for the rest of my life. And then I started getting older. I started getting injured.

Sol Rashidi [00:04:40]: I wasn't recovering as fast. So from there, I took the first job offer that came my way, and I actually became a data engineer. About six months into the gig, my teammates approached me and they're like, listen, Sol, we love you, but you're never allowed to touch a line of code again, because I didn't know how to write production-grade code, and that was the reality of it. But I was a bit of a Chatty Cathy. And so they're like, but you like talking to the business all the time and you tend to network, and that's not something we prefer to do. So why don't you be that go-between? You go talk to the business, gather their requirements, figure out what it is that they want to do, and then come back and translate everything. So I wrote functional specs, tech specs. Sometimes I even pushed back on the business, and I was like, is that really what you're asking for? Because here's what I'm hearing, but here's what you're saying.

Sol Rashidi [00:05:24]: Long story short, what I originally thought was, like, rejection, and then turned into just being this glorified translator, actually turned out to be my superpower, and that is to be tech and functional. I can sit within the business and understand what it is that they're actually trying to do, and then I can actually convert that. And yeah, it's been a fun ride. I've had some amazing jobs, but I've been in the data space since the late nineties. I know I totally just aged myself, but I helped IBM launch Watson in 2011. My job in 2011 was to fly around the world, work with these companies that wanted to talk about Watson and AI, help them establish strategies, establish use cases, and once we landed on a use case, fly back to Armonk and Austin and work with the data engineers, work with the ML teams, work with the product managers, and actually develop the product and the platform. So I was forward-selling a lot of the capabilities, and then we would backtrack and try and build some and catch up. And then things just kind of progressed.

Sol Rashidi [00:06:25]: My teams and I built some cool things for different industries like travel and leisure, music, and municipalities. And I've had the opportunity of being CDO, CAO, chief data and AI officer four times over. So kind of a veteran in the space, and to quote my father, same sand, different shovel. They've all struggled with the same things, even though they were very, very different industries. And then about a year ago, I took some time off from enterprise. I said, no mas, I'm burnt out, I want to pursue other passions. Wrote the book Your AI Survival Guide, which has been amazing. Got a chance to speak, work with startups, fell in love with startups.

Sol Rashidi [00:07:03]: High caliber, high velocity, frontline, doing actual real things for folks. But oftentimes they lack that integration into enterprise. They didn't know how to translate what they were doing into an enterprise playbook or the organizational structures of enterprise. And so I've been playing that role. And then today marks my 49th day: I actually took an offer with Amazon as their head of technology for North America for the startups division. So I'm back in the belly of the beast. Now, with that said, I know everyone knows this, so apologies if it's a repeat, but yes, all industries are being impacted, even those that are historically considered to be non-tech-savvy. So if we take a look at education, construction, agriculture, basic materials, real estate, everything's being discussed, but for most functions, there's still a bit of an adoption curve that's starting to come to light.

Sol Rashidi [00:07:57]: But marketing and sales, as we all know, that's the number one use case that tends to come up, because that work tends to be very, very laborious and manual in nature. Customer operations, that's a classic one. R&D, software engineering. So it's interesting, because every corner you turn, every conference you go to, heck, we've even got LinkedIn experts in AI, which I think is a very, very funny term, especially if you've only been doing it for a year and a half or two. I've been in it for over a decade and a half and I can barely keep up with what's happening. I don't think anyone's an expert per se, but there's a lot of hype and there's a lot of reality. But just some fun facts, just to kind of plant some seeds: by 2030, they estimate that global investment in AI is going to be at $1.85 trillion.

Sol Rashidi [00:08:48]: Now, I'm going to share an unfortunate thought, but do you know what the cost is to solve world hunger? It's less than half of that. So in the next few years, on top of what we've already invested, there's going to be more investment in the capabilities of artificial intelligence than it would actually cost to solve world hunger globally, right? So there's that much money pouring into it. The highest adoption rates right now by nation: India, Singapore, the United Arab Emirates, and China. But here's a fun fact. Take a look at where the US is. Even with all the hype and all the conversations and all the conferences and all the enterprises and all the marketing, we have the lowest adoption rate. And part of it is there's a lot of activity around exploration, but the differential between exploration and deployment is actually quite large for where we are compared to a lot of the other countries. So the first thing is, why all the challenges? I don't get it, right?

Sol Rashidi [00:09:53]: It's obvious in our world. It's so obvious, like, the amazing things that these capabilities can provide. And I grew up in the data space, right? And everything at the end of the day still boils down to data. That's my heart, that's my love, that's my passion. And I do view AI as a product of data. But for things that are so obvious, why do we still have challenges with adoption rates? Well, I can only share this: I've done over 200 POCs, and I have 39 products that are in production that are enterprise grade with security. So not everything I've developed, built, or deployed with my team has actually made it into production.

Sol Rashidi [00:10:32]: That's just the reality of it. You can't get attached, you know, to every person you mentor or every project you groom and give your blood, sweat, and tears to. But when I've gone back and done these postmortems and taken a look at what's happened, only 30% of our hurdles were really technically oriented: infrastructure, workload, products, training models, data security. 70% of all the deployments and experience that I've had is actually dealing with non-technical friction. There's just a differentiation with expectations, fear and resistance from individuals who, quite frankly, are worried about job loss, and workforce maturity. They're just not there yet. Some can't even think big enough.

Sol Rashidi [00:11:16]: Their imagination doesn't even go that large, to understand that this could be a reality for them. Talent: they just don't have the right internal folks. They have to bring in consultants and externals, and that's fine. But half the time they don't even know how to vet who knows what, because everyone tends to shove a bunch of terminology into their CVs. And then the hero syndrome. You've got teams who always want to work on the latest and greatest and want to be innovative, but they have absolutely no clue what they're doing. But they love being the heroes of the story without even having the expertise or the wherewithal to even understand how to actually deploy it. So most of the stuff is non-technical.

Sol Rashidi [00:11:54]: And on top of that, you add that, yes, disruptions are hard to begin with, and you guys already know this, but just to make sure it really hits home: the pace of change is fast, so it's impossible to keep up. I mean, if someone asks me, hey, what's the modern tech stack? I'm like, well, it depends. And there are so many variables that go into that conversation. It used to be an easy conversation. It isn't anymore. Yeah, there's resistance, just because new terms, new advents, new trends, stuff is scary at first. You know, the cell phone took about 23 years for adoption. The refrigerator took 47 years, the Internet took 27 years.

Sol Rashidi [00:12:30]: Things just take time, I think. Lack of foresight: there are so many things that hit us every two years. The new buzzword, whether it's NFTs, crypto, blockchain, web 3.0, the metaverse, big data, AI. You know, folks who don't live in our space are being bombarded constantly, and by the time they finally get comfortable with something, we've got something else to introduce. And then organizational inertia. There's all this amazing "inspirationally, this is what we want to do," but aspirationally, this is how much we're willing to invest in it. There's a big difference between perspiration and aspiration.

Sol Rashidi [00:13:07]: Everyone wants to be innovative and be on the bandwagon, but not every company is willing to put in the necessary resources to actually back up what they're saying. So, long story short, if I were to aggregate 14 years of deployment experience with artificial intelligence capabilities, with Watson being the first take, us doing predominantly B2B work and now pivoting to B2C, with everything that's happened across 200-plus POCs and now the 39 applications in production, here's how I would summarize it. First is strategy. Strategy doesn't align with internal maturity. Oftentimes you get these management consulting firms coming in, and no knock to them. I was one of them. And you get the executives in place. But the executives aren't necessarily aligned with the folks on the field and the practitioners and the reality of what's actually possible.

Sol Rashidi [00:14:02]: So you have this amazing deck with this amazing strategy, with foundational principles, key values, how we're going to move forward, pie-in-the-sky approaches. But the reality is, actually very little of it is deployable, because they just don't have their finger on the pulse of the actual maturity of what they can do with their infrastructure and what they can do with their data. And so oftentimes you get these use cases that folks want to explore, and more than half of them, 75% of them, aren't even doable, because they just don't have the organizational maturity, they don't have the technical maturity, they don't have the data maturity. And so when I see these AI strategies and people bring me in and go, hey, can you help us with this? We're stuck. I'm like, okay, let me see your data strategy first. What's in place? What's in flight? Where are you aspirationally going? And let's map the two together. And it's like crickets. And so, strategy just doesn't align with maturity.

Sol Rashidi [00:14:54]: The second is the why. This sounds so silly, and if any of you guys have done your MBA or you read leadership books, it's always the why. Or if you're a fan of Simon Sinek, it's always the why. But when you translate the why into what we have to do day in and day out, I say, okay, we want to do AI, but why? Is it because the board said so? Is there a real business problem we're solving? Is there a competitive and imminent threat that we have to fundamentally be aware of? Is it to be innovative, or is it just FOMO kicking in? What you deploy actually has to align with the why, with these five that I just talked about. Because if it's the board that said so, you've got to pick the silliest and simplest use case, because clearly the company is only doing it because they got a top-down mandate. And so the needed investments that may be required for something that's more complicated just aren't going to go into it. And you're going to be all in, blood, sweat, and tears, and you're going to want to put in nights and weekends making something work when, at the end of the day, their heart's not in it.

Sol Rashidi [00:15:59]: It was a board mandate, so you've got to match the use case with the why. But if there's a competitor, right, if there's a real problem, now we're starting to get the wheels in motion. We're getting warmed up. And if it's FOMO, I would also suggest a simpler use case. So I always say, think big, but start very small, and then you can decide if you want to scale quickly. The third one is around setting expectations. It's a thing that I have, but I really don't like the word artificial. I don't think there's anything artificial about artificial intelligence.

Sol Rashidi [00:16:33]: It's still fingers to keyboards. We're still fundamentally training, tuning, doing everything behind it. And so one switch that I've done when I present to boards or the executive steering committee is: don't think of it as artificial intelligence. Think of it as AI, but the A could stand for automated intelligence, augmented intelligence, or anticipatory intelligence. Automated intelligence is all around automation, taking mundane tasks and automating them. Augmented intelligence is being able to do more. I love the customer service use case. Fundamentally, if you're a company, let's say you're CPG or retail, and you sell 80,000 SKUs, you can't expect your customer service reps to remember every single detail about those 80,000 SKUs and what's vegan versus what's not.

Sol Rashidi [00:17:18]: And so being able to augment their knowledge set so that they can do a quick search and understand which of the products are vegan, that's augmented intelligence. And then anticipatory is being able to understand the periphery of all the different data sets and predict patterns in order to help you make business decisions that wouldn't ordinarily be seen with linear regression models or descriptive analytics and all the other fun stuff. So I'm not a Debbie Downer by any means. There is a level of excitement, but I take that word artificial and reset it to either automated, augmented, or anticipatory, depending on the use case, and it becomes a lot more palatable. I would say the next one, the fourth one, and forgive me for the typo, is communications. Take a minute and just read this. For any of us who work in ad tech or martech, or we're developing products for the marketing team, I present this because this is how we sound when we're in front of customers, clients, and executive stakeholders. Just close your eyes and let me read this to you, because this is what they hear when we talk to them. Our marketing plan is simple.

Sol Rashidi [00:18:32]: We just need to focus our CRM and CMS on ABM for higher CLV and lower CAC while A/B testing our CTA and UX for CRO, and hope we get some WOM. But don't ask me about ROI. If you actually went through the exercise and closed your eyes while I was talking, this is how our audience hears us. Unless you're talking to the CTO, unless you're talking to another engineer and you're selling to another startup or small-sized company, they may or may not have the budget. The people that do have the budget and want these capabilities oftentimes don't use our language. And so you always, always, always have to understand their language and communicate the benefits, the value, the capability, and why what you're building rocks in their language. Otherwise, what I just read to you is exactly how we sound, which is why that gravity and that emphasis oftentimes is lost on them. Selecting your use case: this one's a sticky one.

Sol Rashidi [00:19:34]: This one's a little bit provocative. But when it comes to artificial intelligence, and I've actually adopted this for products as well, any product, data product, consumer product, definitely AI product, I actually don't use business value as the primary means of selecting a use case. I'll give you a few reasons why. One, a lot of assumptions go into developing those numbers. Those assumptions may or may not, excuse me, be true. So when someone from manufacturing comes to me and says, I want to do this use case because this is the potential ROI and business value from it, versus someone in supply chain, versus someone in procurement, versus the division head of a brand, they've all embedded assumptions into their calculations, and the business value they come forward with could be completely bogus. That's one.

Sol Rashidi [00:20:25]: So assumptions: there could be truth and there could be not truth. The second is, what's business value to manufacturing versus supply chain versus a brand? We're not in a position to determine which business value is higher ranking or not. And you can actually get yourself into an emotional conversation, and inadvertently and indirectly you can actually create some enemies along the way. Believe it or not, if you were to choose a business case that isn't with manufacturing or supply chain, but you decided to go with the brand president instead, it's funny, it's a subjective conversation, tried and true, and it can get very emotionally charged. So I created this framework back when I was at IBM, because I found myself in awkward situations. Instead of business value, go through these ten different qualifiers in two categories: criticality and complexity. The criticality of the use case for the company's longevity. Is there a competitive threat? Is there market consolidation? Are there government regulations and fines? Exposure and press? And business value is a measure, but it's not the measure.

Sol Rashidi [00:21:31]: And you go through and you rank each of these and you apply a weight. Then you go through your complexity to deploy. The stakeholder that brought this to you: are they going to be involved, and are they a good partner? Will they be engaged? Will they make time for you in their calendar biweekly? This is really important. We often forget to ask that ourselves. But having a good partner on the other side is so key, because they're going to clear the runway for you. The second, the people you need: are they going to be available, or are they on eight other projects? Because oftentimes it's the same 20% that gets allocated to things. Do they even have the possibility of squeezing this in? We all know about data accessibility and integrity. Dependency on other functions: how many other teams do you have to depend on for this use case? The fewer the teams, the easier it is for you to deploy; the more teams, the more complex it's going to be. Why? Just basic scheduling and calendaring and getting their attention and alignment.

Sol Rashidi [00:22:28]: Because nowadays decisions are made based on consensus and not consideration. So these things create drag and lag. And then, of course, infrastructure: do you have the basics to be able to even explore the use case? You also rank these. And when you rank criticality and complexity, then you map it on a quadrant. Things that are low criticality and highly complex, they're no-brainers, they're non-starters. Don't even start. Things that are very critical.

Sol Rashidi [00:22:54]: Lower complexity, those are your green lights. Go for them. And then things that are highly critical but also complex in terms of deployment, those are things you're going to have to plan for. I wouldn't start out with those use cases yet, because you've got to develop the muscles with the lower-complexity use cases before you start taking on the higher ones. Then we get into accuracy and expectations. Watson beat Ken Jennings in Jeopardy, if you guys saw that in 2011, with a 71% accuracy score. That's a C-minus. The machine was better than the human with a C-minus.
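A minimal Python sketch of the weighted criticality/complexity scoring and quadrant mapping Sol describes above. The qualifier names, weights, and the 3.0 threshold here are illustrative assumptions, not the exact ten qualifiers or weights from her framework.

# Hypothetical qualifiers, each scored 1 (low) to 5 (high), with relative weights that sum to 1.
CRITICALITY_QUALIFIERS = {
    "competitive_threat": 0.3,
    "market_consolidation": 0.2,
    "regulatory_exposure_and_press": 0.3,
    "business_value": 0.2,  # a measure, but not the measure
}
COMPLEXITY_QUALIFIERS = {
    "stakeholder_engagement_gap": 0.25,  # a disengaged partner adds complexity
    "people_unavailability": 0.25,
    "data_inaccessibility": 0.25,
    "cross_team_dependencies": 0.25,
}

def weighted_score(scores: dict, weights: dict) -> float:
    """Weighted average of the 1-5 qualifier scores."""
    return sum(scores[name] * weight for name, weight in weights.items())

def quadrant(criticality: float, complexity: float, threshold: float = 3.0) -> str:
    """Map a use case onto the criticality/complexity quadrant."""
    if criticality >= threshold and complexity < threshold:
        return "green light: highly critical, lower complexity"
    if criticality >= threshold:
        return "plan for it: critical but complex to deploy"
    if complexity >= threshold:
        return "non-starter: low criticality, highly complex"
    return "optional: low criticality, low complexity"

# Example: a hypothetical customer-service augmentation use case.
crit = weighted_score(
    {"competitive_threat": 4, "market_consolidation": 2,
     "regulatory_exposure_and_press": 3, "business_value": 4},
    CRITICALITY_QUALIFIERS)
comp = weighted_score(
    {"stakeholder_engagement_gap": 2, "people_unavailability": 3,
     "data_inaccessibility": 2, "cross_team_dependencies": 2},
    COMPLEXITY_QUALIFIERS)
print(quadrant(crit, comp))  # -> "green light: highly critical, lower complexity"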

Sol Rashidi [00:23:26]: That's kind of crazy. And I remember, two years ago when we used to run machine learning models, there was always this funny image: you know, the model couldn't tell the difference between a Chihuahua's face and a blueberry muffin. But now it can tell the difference between a Chihuahua's face, a blueberry muffin, and a chocolate chip cookie. It's gotten really sophisticated. About a year ago, I typed "salmon in water." Contextually, it was still getting it wrong. I was expecting the fish.

Sol Rashidi [00:23:48]: It gave me a filet. That's been course-corrected since, so you can see the advancements in sophistication. But a few months ago, typing in "eggplant" without context, it still had no understanding. So there are still some expectations that we have to set. But no matter what, these are hurdles that can be overcome, because anytime someone comes to me and says, yeah, but it's not accurate, it hallucinates, it's biased, I say: humans are biased too. Sometimes we give a point of view and hallucinate without any real facts to support it. And when's the last time we actually measured the human error rate so that we can compare it to the machine error rate? I hear these words, but they're not registering, because when's the last time we actually took into consideration how often we get things wrong, so that we can do a side-by-side comparison? And oftentimes I'm in the argument with that one.

Sol Rashidi [00:24:36]: Recruiting and talent. If you guys are selling to enterprise, or working with enterprise, or in enterprise, the division of labor is really confusing. Who ultimately has the budget, has the buying rights? The CIO, the CTO, the CDO, the CAO, whatever the new CXO is? There's never a lead, and you have to assign a lead with these projects. We like the matrix way of working, but you've got to have a captain on the ship, you've got to have a driver in the car. And organizations sometimes want to create a committee and a function, but with cases like this, it gets really hard. You need a lead to keep the momentum going. You need specialists on these types of projects.

Sol Rashidi [00:25:17]: Generalists won't do. So if you're at a client site or customer site and they just don't know the space, and they've applied a bunch of generalists, who are really good at figuring things out, you're going to need some support. And then there's the "beware of experts" one. People who say they're experts, but they're learning and earning at the same time. That's my motto for them. They're learning on the job. And after they've learned and earned on two or three gigs, by the fourth gig they may have enough to contribute.

Sol Rashidi [00:25:43]: But you've got to be careful of the so-called experts. And then I kind of jumbled these ones on infrastructure. It's not whether you're on Microsoft or Google or AWS; it really doesn't matter what cloud provider you're on, because all of them have services that can help. The problem is we're now being pressed on connecting all those services together to be able to create these deployments, and that's a new muscle for many of the cloud ops teams or many of the DevOps teams. The second is budget: we often only budget for the POC. We don't actually budget for things going into production. And I have gotten approval for moving forward with things in POC only to find out, and this is why, of the 200-some, so few made it, they don't have the budget, or the new shiny toy kicked in.

Sol Rashidi [00:26:29]: And I don't have production budget to push things into production. So I always make sure to give them the full picture of what it's going to take to kick around the idea, establish a use case and actually do the POC and then what it's going to take to push it into production. And then my favorite is buy versus build. You're going to come across a lot of departments who love to build, whether or not they're qualified to do it. And this picture of Jason Momoa is my favorite. Buy the stuff that's already been figured out. Things are just really cheap right now. You guys are all starting amazing companies.

Sol Rashidi [00:27:00]: You solve those problems. You've got to be able to visually explain the hurdles they're going to come across if they were to try and build this internally, because there's just a lot of heartburn and roadblocks along the way. Now, data is a sticking point. It's never going to be perfect. I'm okay with that. I've made an entire career out of data never being perfect. But remember number four in that framework of using complexity and criticality, the one about data accessibility and hygiene. What I can do is increase my probability of deploying successfully by picking use cases where data hygiene is in tier one, where it's good-ish. It's not spotty, it's not okay-ish, it's good-ish. And the reason it's good is sometimes they have an MDM or an EDM team who's constantly monitoring the hygiene, or it's data sets that, if they were to get them wrong, there's a high cost associated with it.

Sol Rashidi [00:27:55]: High fines or bad PR: raw materials, bills of materials, 10-Ks, contracts, event logs, opt-in/opt-out, e-commerce-generated data. So you're never going to get perfect data, but you can categorize the data sets into tiers. And I always say, in that complexity, don't pick a use case where the data is spotty. At a minimum, pick the use case where the data is good-ish. And then the last one is, if you take a look at the countries that we talked about, China, India, the United Arab Emirates, a lot of those societies, I would say, can enforce: the companies can enforce, the governments can enforce, the entities can enforce, "we're doing AI, so follow." Right? But we're very much a democratic society.

Sol Rashidi [00:28:41]: We're very much decision making through consensus and alignment versus consideration. And so if you think about the AI pyramid of scope, and by the way, a lot of these principles and thoughts are in the book, we're not a top-down society, so we literally have to go through every single layer to be able to come up with an understanding of why we need to do something and the reasons why, and our biggest hurdle is we just have to constantly sell internally. With that said, thank you for this time. I hope it was helpful. That 70% of non-technical stuff you have to deal with, I aggregated 13, 14 years of experience into a few slides. I know it was fast, but hopefully you were able to pull away some great nuggets. All right, Demetrios, it may be me, or maybe you've muted me, but I can't hear you.

Demetrios [00:29:48]: I muted myself. I did.

Sol Rashidi [00:29:50]: Ah, there you go. We don't want to hide that beautiful voice.

Demetrios [00:29:54]: Classic virtual stuff happening. So there's a cool question in here from Skyler that was saying, like, when it comes to the specialist versus generalist type thing, one thing that is interesting to harp on is: do you see that being different in different phases of a company's journey or of a project's journey, that type of thing?

Sol Rashidi [00:30:26]: Yes. Fundamentally, I think part of the challenge is sometimes the folks who are making the decision can't tell the difference between a generalist and a specialist, and they tend to pick favorites or people that they've always worked with on projects. That's just not the right approach. You kind of need someone who's going to be in there and be radically honest, not rude. Sometimes we can be rude. No, be radically honest without being rude, like, "this isn't going to work because of this," or "if we're going to try this approach, we've got to do these four other things," and 100%. So it's funny, I have this one chapter in the book, and it's around the ten AI archetypes, and it's the characters that you're going to constantly meet: the curmudgeon, the naysayer, the cheerleader, the evangelist, and their character profiles.

Sol Rashidi [00:31:13]: And every phase requires a different set of talent and skill sets. You're not going to invite the curmudgeon, that negative person who's been in the company for, like, 20-plus years, or is a master of the craft but has, like, the Uncle Rico syndrome: he or she did something really cool 10, 15 years ago but hasn't invented anything in the past two years. That character, you're not going to bring them into the ideation phase, the strategy phase, the use case phase. You're not going to even bring them into the conversation. The scoping phase, though, and pivoting to solutioning, that's when you're going to want to bring that person in, to uncover every blind spot that you guys ignored along the way. Because what are they really good at? They're really good at saying no, and so you actually want that person on your team, even though it's difficult to work with him or her, because they're able to uncover blind spots or unearth truths in your assumptions when you're about to pivot from scoping to actually solutioning, because their job is to uncover dirt, and they love that.

Sol Rashidi [00:32:16]: And so that's an example of different people, different traits, different skill sets along each of the phases, and how to operate with each one of them. So even if someone's pissing you off because they're just angry all the time, or rude, they actually serve a real purpose, but know which purpose and when to introduce them.

Demetrios [00:32:33]: So I like this, how it's throughout the life cycle, you want to be thinking about when to bring in which stakeholder and for what reason.

Sol Rashidi [00:32:45]: The stakeholder will stay consistent. It's the talent that you need to unlock and complete each phase in a healthy way so you can move forward to the next phase. So for that question around generalist versus specialist: you're not going to bring DevOps into ideation. You're not going to bring ops into strategy. You're just not going to do that. You're not going to bring in the curmudgeon, even personality specialists, like, it takes a certain gene to be angry all the time. You're not going to bring that person in when you're inspiring and ideating and selling. So in that life cycle, you have to know which generalist and specialist you're going to bring in for which function.

Sol Rashidi [00:33:22]: But the executive committee, the stakeholder group that stays consistent on top of whatever program or project you're deploying.

Demetrios [00:33:29]: Incredible. So thank you so much. I know you've got a flight to catch, so I want to make sure you don't miss your flight. And the complexity graph that you had, on how to choose which use case, I thought was especially, it just hit home, because there are so many times that we spend so much time on the wrong use cases and don't realize it until after we've spent the time on it. So just breaking it apart and really looking at it through that lens of, what's the complexity? And I can't remember the other one now. I gotta go rewatch this. Criticality?

Sol Rashidi [00:34:05]: Criticality.

Demetrios [00:34:07]: So, yeah, how critical is it to the business and how complex is it that just puts things in a different lens?

Sol Rashidi [00:34:15]: And, you know, pardon my French, I got my ass handed to me the first few times, because I was like, why is everyone angry at me? Because I got told which use case to deploy and manage, so why are they angry at me? And I was like, okay. One, this is emotionally charged. Two, when I actually went through the business value calculations and we turned around, they were always disappointed with us that we weren't able to match the numbers that they had originally provided on a slide. I was like, these numbers were actually impossible. Who came up with this? And then you don't even know who came up with it. And then when you actually are like, no, I need to know: how did we say 33% increase in productivity? It wasn't the stakeholder. It wasn't the VPs reporting to the stakeholder. It wasn't the directors.

Sol Rashidi [00:34:56]: It was some analyst in a spreadsheet, number crunching a best guesstimate that felt right. And I'm like, so you're going to hold me accountable to some person's Excel spreadsheet number-crunching assumption exercise? Like, no, no, no, no. We're not doing that anymore. Yeah, but it took a few times falling on my face. I figured it out.

Demetrios [00:35:14]: "This is how we found that number." Good to me. Yeah, totally. Incredible. Well, so awesome. Next up, we're going to have Jesse come on, and he's going to talk about a survey that he did about all the data teams and the value that they're providing. So it dovetails perfectly into what you were just talking about.

Demetrios [00:35:39]: But a huge thank you to you, and I really appreciate you taking the time and making sure this happens, even at the airport.

Sol Rashidi [00:35:49]: Always get it done. Thank you for having me. Joe, you're amazing.

