Chat and Chai with Megs

Tips and Tools: Improving Data and Cyber Security for Domestic Violence Organizations

May 17, 2021 Megs Shah Season 1 Episode 3

Data and cyber security are critical to the safety of victims and advocates in domestic violence work in our digital world. On this episode of Chat and Chai with Megs, I'm going to explore cyber security, data security, how to govern that data, and what the future holds as far as artificial intelligence is concerned. I have two amazing guests who are going to shed some light on what tools you can use and simple steps that you can implement for your organizations right away.

Megs Shah:

Welcome to Chat and Chai with Megs, where we cover all things policy- and technology-related that affect domestic violence organizations and the individuals they serve.

Music:

[ Intro Music]

Megs Shah:

Hi everyone. My name is Megs Shah. I am your host for Chat and Chai with Megs. I'm also the co-founder and CEO of The Parasol Cooperative, a non-profit organization which provides technology tools, knowledge, and services to domestic violence organizations so they can do what they do best: protect people. On this episode of Chat and Chai with Megs, I'm going to explore cyber security, data security, how to govern that data, and what the future holds as far as artificial intelligence is concerned. I have two amazing guests who are going to shed some light on what tools you can use and simple steps that you can implement for your organizations right away. So without further ado, let's get to it. Hi everyone. We have two amazing guests with us today. As I mentioned previously, we have Krishna Cheriath, who is the Head of Digital, Data and Analytics at Zoetis, and we have Viral Trivedi, who is the CBO at Ampcus? Is that right? Did I say that correctly? Ampcus or Ampus?

Viral Trivedi, CBO at Ampcus Cyber:

That is correct. It's Ampcus.

Megs Shah:

Perfect. I am not going to do justice to their introduction, so I will let them introduce themselves. So Krishna, why don't we start with you? If you could give us a little background on you and then we'll go to Viral.

Krishna Cheriath, CDO and Digital @ Zoetis:

Great, Megs. Happy to be here. Krishna Cheriath, Head of Digital, Data and Analytics at Zoetis, which is really a Chief Data and Analytics plus Head of Digital Strategy role. I joined Zoetis, which is the world's largest animal health company, in late October last year. Prior to that I was the Chief Data Officer and led digital strategy on an interim basis at Bristol Myers Squibb, and before that I spent most of my career in the triangle of digital, data, and analytics across management consulting and healthcare. I also teach part-time at Carnegie Mellon in their executive education program, and the topics that I teach are digital strategy and data strategy for aspiring CDOs and CIOs. Happy to be here.

Megs Shah:

Thanks for joining us, Krishna. Viral, can you introduce yourself and give us a little background?

Viral Trivedi, CBO at Ampcus Cyber:

Absolutely. Thank you, Megs. It has been a great pleasure joining you on this particular session of Chat and Chai with Megs. I'm the Chief Business Officer for Ampcus Cyber. I've been doing this for over 20 years, but in my current role we have been helping customers achieve a lot of cybersecurity outcomes, and we are focusing on resiliency, because everyone knows that cybersecurity events and incidents are inevitable. So we help customers achieve those outcomes on how they can be resilient and be prepared. In the past I've worked for organizations like Ernst & Young, General Electric, Verizon, and AT&T, and I've been doing this on both the industrial side of things as well as with corporations working in financial services, healthcare, oil and gas, power, and utilities. So happy to be here and looking forward to this conversation. Thanks.

Megs Shah:

Super excited. And you know, it's interesting you bring up that cybersecurity incidents are inevitable. I'm sure it's on your radar with the whole North Carolina situation, with the hacking and there being no gas. One of my teammates is actually down there, and she was telling me they have long lines of people waiting for gas. So it seems to be an appropriate topic to be talking about right now. But before we get into the cyber side of things, one of the challenges, Krishna, I'll start with you: domestic violence organizations have so many clients that they deal with, and a lot of data that they collect. And what concerns me, as someone who's been helping these organizations, is data breaches. Data breaches for me are very, very important, because not only are these individuals traumatized from their situations, but if their data gets compromised, it's even worse. So I guess for our audience, if you could maybe help us understand what a data breach is, and then also talk a little bit about some of the things that organizations can do to be better prepared for anything that might be coming.

Krishna Cheriath, CDO and Digital @ Zoetis:

Yeah, I think, Megs, when you think about a data breach, I'll elevate that to a data incident. It could be misuse of the data, loss of data, or even incorrect data leading to incorrect outcomes. It could be any of those aspects of a data incident. And for any organization, small, medium, or large, it is an inevitable consequence of the digitally connected world that we live in. We have to understand that we have a responsibility towards our stakeholders to make sure that we collect only the data that we truly need, that we are transparent about why we are collecting the data and what we are using it for, and that we take seriously our responsibility to make sure that data is safeguarded and managed effectively along its lifecycle. Just like wine, which ages with time, data ages with time, and you need to make sure that data is being used for the intent for which it was collected, which often tends to get frayed as the months and years go by. And then, just as in an office you have a recycling bin and you dispose of physical documents through shredders, you need to have an effective disposal of data beyond its useful life. All aspects of that are important, and the reason why is the trust between you and your stakeholder. It could be a customer, a collaborator, a partner, any number of relationships. Showing that you're using the data responsibly, that you're being proactive about it, and that you are implementing the right mechanisms, whether technology, people, or process, is an essential, minimal cost of doing business in today's digitally connected world, because you only need to erode that trust once to lose that connection.
So I think that's something that every organization has to think seriously about, and it is not a huge capital investment. Being very diligent means, if you're collecting this data, asking yourself: should you be collecting this data? If I can do without some aspect of it, don't collect it. Am I telling people the reason for which I am collecting it? Am I safeguarding it? How am I disposing of it? These are basic measures that everybody can take, and it doesn't have to be capital intensive to implement good data practice.

Megs Shah:

Yeah. I mean, I think that's sort of the underlying kind of information we've been getting from organizations as we work with them: they've been collecting data maybe for purposes of reporting back to their donors. And so we would caution that unless it's absolutely necessary, you should question whether the data that you're collecting is necessary or not. In many regards I do understand the implications of collecting the information for donation purposes, because donors do have specific requirements for reporting. But coming back to your point that we're now all in a digital world and need to look at how we actually collect, preserve, and archive or dispose of data: do you have specific reference guides or anything that you can share with us? Maybe just one or two tips would be great on how to do that, because a lot of domestic violence organizations are not as tech savvy. Small and mid-size organizations are likely grassroots organizations that don't really have an IT department, and they rely on consultants. And having been a consultant in the past, I know that you can also pass on some misinformation. So any tips that you could give them would be greatly appreciated.

Krishna Cheriath, CDO and Digital @ Zoetis:

Yeah, I think there are some really good domains that have great information around this. For example, in the case of a domestic violence-centric organization that has to report certain stats to its donors, you are forced to collect a certain amount of personal data, but you may be reporting out only aggregated measures in terms of the number of people you are helping and supporting, et cetera. So you could implement data minimization and data anonymization that allow you to protect that information. To get a feel for all of these different aspects, Dataversity (dataversity.com) is a great site. They have many different data strategy papers, practical KPIs and measures, and ways to implement effective mechanisms. A little bit more evolved is the Gartner CDO Circle (https://www.gartner.com/en/conferences/apac/data-analytics-australia/programs/chief-data-officer-circle). It is a resource that I tap into often to get a feel for what kinds of technologies you can implement and what some of the recommendations are. The MIT CDOIQ (https://mitcdoiq.org/), even though it sounds high and mighty, has a lot of useful information that you can practically implement. So there are many different domains like this that help you look at what is a minimalist yet essential data strategy that you can implement, and what are the ways you can do it with a minimum of bureaucracy.
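Krishna's suggestion of reporting aggregated measures to donors instead of raw personal data can be sketched in a few lines of Python. This is only an illustration of the idea, not a tool he names; the record fields ("name", "service") are hypothetical.

```python
from collections import Counter

# Hypothetical client records; the "name" field is identifying
# and must never leave the organization.
clients = [
    {"name": "A. Doe", "service": "shelter"},
    {"name": "B. Roe", "service": "counseling"},
    {"name": "C. Poe", "service": "shelter"},
]

def donor_report(records):
    """Aggregate by service type and drop every identifying field,
    so the donor report contains counts only (data minimization)."""
    return dict(Counter(r["service"] for r in records))

print(donor_report(clients))  # {'shelter': 2, 'counseling': 1}
```

The donor still gets the numbers they need for reporting, but no personally identifiable information ever appears in the output.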

Megs Shah:

We love that. We like that one a lot. I mean, there's so much red tape to be had. And I think, you know, Viral, I'll go to you in trying to understand: I'm sure you've worked with clients that have actually had cyber attacks and data breaches happen. And there are certain preventative measures, and then there are some reactive measures that could be taken. Any thoughts, from your perspective, on ways to put in certain preventative measures outside of just the data protections?

Viral Trivedi, CBO at Ampcus Cyber:

Thanks, Megs. So before we go into the preventative measures, I want to mention something about the data topic we were discussing. I think classification of the data is very important. If you understand what type of data you're storing and you understand the confidentiality of the data, if you're taking patient data, healthcare data, we know that is something that needs to be protected. There are standards and frameworks, such as HIPAA and HITRUST, that help you do that. But I want to go back a little bit to the people front, because with domestic violence organizations, you brought up a very important point, which is that these people are not tech savvy. The biggest challenge we have seen is that most of these domestic violence organizations are working with volunteers, or folks bringing their own devices: BYOD. So you don't have a good track record of how they're managing their devices. How are they accessing things? Where is that laptop or that mobile device connecting elsewhere on the internet? You're going to have those challenges when you're working with volunteers and have no control over those devices. So one thing is to create an environment that allows these individuals to access the data with the least privileges for the data they're accessing, and at least have a policy and procedure in place that says that if you're using or accessing this information, your laptop or your endpoint has to have some kind of protection, like anti-virus or malware protection. That's just at a very high level. So from a people point of view, that is one thing. The second thing is that people have to be made culturally aware of cybersecurity challenges. Phishing is one of the biggest problems we have seen: if you're in a volunteer organization, you're going to get an email that says, hey, here's how to get a grant; you click on it, and now you're hacked or hit with ransomware. So awareness is very, very important.
And the third thing is preventative measures around protecting the data and accessing the data. If you're using a cloud-based application, make sure that application has the right security controls in place, so that you have access to only the information you are supposed to be working on, while someone like Megs, who controls all the other things, has access to everything. It's basically based on their title or their position. So role-based access control is very important. Those are some of the preventative data measures that I would recommend, but yeah, the people problem is the number one problem. And for DV organizations, as you mentioned, it is very important to make sure that the laptops and devices of the people accessing this information are secure.
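The role-based access control Viral describes can be sketched as a simple mapping from roles to least-privilege permission sets. The role and permission names below are made-up examples for illustration, not a specific product's API.

```python
# Minimal sketch of role-based access control (RBAC): each role maps
# to the smallest set of permissions it needs (least privilege).
# Role and permission names here are hypothetical.
ROLE_PERMISSIONS = {
    "volunteer": {"read_own_cases"},
    "case_manager": {"read_own_cases", "write_own_cases"},
    "director": {"read_own_cases", "write_own_cases",
                 "read_all_cases", "export_reports"},
}

def is_allowed(role, permission):
    """Return True only if the role explicitly grants the permission."""
    return permission in ROLE_PERMISSIONS.get(role, set())

# A volunteer can read their own cases but cannot export reports.
print(is_allowed("volunteer", "read_own_cases"))  # True
print(is_allowed("volunteer", "export_reports"))  # False
```

The key design choice is that an unknown role gets an empty permission set, so anything not explicitly granted is denied by default.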

Megs Shah:

Yeah. And I think there are a few things, right off the bat, that we've been talking to our clients and members about. At a minimum, there are plenty of VPN capabilities now that are not necessarily tied to a specific product. Norton has one as part of their antivirus protection for laptops, which you can enable. It's not very expensive, and it actually protects your laptop from a lot of other threats that could be coming in from a personal perspective, too. And I think the second part that you hit on was that a lot of individuals don't know what to look for when it comes to phishing, or what to look for in the privacy or data policies of software that they're thinking of using. Many organizations use Excel, which is fine, it's local. But if that Excel file is emailed to another person, then that becomes an issue as well, because now you're passing along a lot of PII (personally identifiable information) data. I do know that there are protections around HIPAA and VAWA that do apply to DV organizations, and many of them do have a checklist that they have to follow. But the things that you mentioned are not necessarily on that checklist; these are more practical things that we would completely overlook had we not talked about it in this regard. So I guess I would go to both of you on this: are there things that organizations should treat as red flags when they see them in privacy policies for software they're planning to use, just to make sure that as they think about implementing new digital capabilities they're aware of them, and can be proactive about asking the vendors for them?

Krishna Cheriath, CDO and Digital @ Zoetis:

I think, from my perspective, it starts with the collecting of the data itself. If you don't see capabilities with a reasonable level of consent mechanisms that give transparency to the people providing their information about what is being collected and what it is for, that itself is an entry-point red flag. And, as Viral mentioned, if you don't have good tracking of what you are using this data for, how it is being stored, and what it is being used for, and you don't have any disposal mechanism or record retention policy, that is another red flag. So the leading indicators of poor data practice would be: lack of a consent mechanism at the point of collection, no effective lifecycle management, and a zero-to-minimal record retention policy following the classification scheme that Viral talked about. Those are sufficient indicators to know that you need to take action. And it doesn't require big, robust, massive investments; a combination of the right people education, the right set of digital tools with these kinds of capabilities built in, and a minimal amount of data oversight will allow you to make sure that the trust that is key to your core function is maintained in the data world as well.

Viral Trivedi, CBO at Ampcus Cyber:

So I think I will just summarize what Krishna has said, because I was about to talk about the same thing. One, it's how you collect the data. Two, once you have the data, what are you doing with it, how are you processing it. Three, the ownership: who owns that data, is it the client, is it the end user, is it the organization? And fourth is the disposal method. Once you classify all of these, how you process it, who owns it, how you dispose of it, I think it makes it much easier to manage and to see the red flags that pop up. Because the moment you take ownership as an organization, it becomes your responsibility. But if you're just collecting the data for, let's say, research and then disposing of it, then there is some level of red flag unless you've notified the end user that it is only being used for X purposes and that there's an auto-destruct sequence, so that in 24 hours the data will be destroyed. A lot of times, organizations forget to do that. There aren't a lot of tools available out there that let you do that. So: collecting the data at the point of collection, making sure that the end user is aware that this data is going to be used in a certain way, and then also letting them know that this data will be destroyed after a certain period of time. How all of that gets put into action is a completely different podcast we can do.
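The retention idea Viral raises, telling people their data will be destroyed after a set period and then actually destroying it, comes down to a scheduled purge job. A minimal sketch follows; the 24-hour window matches his example, and the record layout is purely illustrative.

```python
from datetime import datetime, timedelta, timezone

# Illustrative retention window, matching the 24-hour example
# from the conversation; a real policy would set its own period.
RETENTION = timedelta(hours=24)

def purge_expired(records, now=None):
    """Keep only records collected within the retention window;
    everything older is dropped (i.e. destroyed)."""
    now = now or datetime.now(timezone.utc)
    return [r for r in records if now - r["collected_at"] <= RETENTION]

now = datetime(2021, 5, 17, 12, 0, tzinfo=timezone.utc)
records = [
    {"id": 1, "collected_at": now - timedelta(hours=2)},   # still fresh
    {"id": 2, "collected_at": now - timedelta(hours=30)},  # past retention
]
kept = purge_expired(records, now=now)
print([r["id"] for r in kept])  # [1]
```

In practice a job like this would run on a schedule (cron, a cloud function, etc.) and delete the expired rows from real storage rather than filtering an in-memory list.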

Megs Shah:

Yeah, I think that's a pretty lengthy one there. One of the interesting points is around the entry point of data. Even before we get to the entry point of data, there's authentication and authorization that need to be taken into account. So I would just add that, as part of any evaluation of software that you're doing, you try to understand how they're authenticating people to get access to the information, and whether it's two-factor authentication, basically meaning something like: you have to enter your email, and then it sends you a text message with a login code, or something along those lines. Google and Microsoft and so many other organizations have implemented these capabilities; it's not very difficult for software vendors to do. But if they don't have that, that would be something I would caution organizations on. Some startups probably don't, but some do, and just asking those questions makes a big difference as well, I feel. And I'll take a step back: there's so much information out there. It's almost like information overload when it comes to data strategy, data security, and cybersecurity, and I feel like we need to bring it down to the basics. For the data side, we've done a phenomenal job of laying out the three tiers: getting it, saving it, destroying it. Those are the three principles that you should never overlook when it comes to data. But when we talk about other things: data is sort of the gold of the organization, but then there's the storage of it, there's the infrastructure you're using for it, there's the network you use to connect to it. There are a lot of other factors involved in cyber and cybersecurity.

It could start as simple as your website, for instance. What are some of those tips, Viral, that organizations can implement that would help make sure they have at least the first guard up for their customer-facing sites?
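The two-factor authentication Megs describes usually pairs a password with a one-time code. One common form is the time-based one-time password (TOTP) from RFC 6238, which can be sketched with only the Python standard library. This is a teaching sketch; real deployments should use a vetted authentication library rather than hand-rolled crypto.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, at=None, digits=6, step=30):
    """Time-based one-time password per RFC 6238 (HMAC-SHA1)."""
    key = base64.b32decode(secret_b32)
    counter = int((time.time() if at is None else at) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                      # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 Appendix B test vector: secret "12345678901234567890",
# time = 59 seconds, 8 digits -> "94287082".
secret = base64.b32encode(b"12345678901234567890").decode()
print(totp(secret, at=59, digits=8))  # 94287082
```

The server and the user's authenticator app share the secret, so both can compute the same short-lived code; a stolen password alone is no longer enough to log in.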

Viral Trivedi, CBO at Ampcus Cyber:

Absolutely. So I think I'll start with the basics. As an organization, you should be aware of what you have, which is basically an inventory of all your hardware and software, because there's a saying in security, and everywhere else: if you can't see something, you can't protect it. So you should have a good understanding of what you have from a hardware and software point of view.

Megs Shah:

One question, along the same lines: should they have the same kind of inventory for volunteer devices that are being used, or is it just in-house assets?

Viral Trivedi, CBO at Ampcus Cyber:

Well, the answer to your question, straight up, is yes. Because if they're using multiple devices to access information in a common repository, like a cloud environment, then that cloud environment should be allowing only those devices. There should be some kind of mechanism that says: okay, I'm giving John Doe access to this environment, and John Doe could be using five different laptops with those credentials to get into the cloud, because it's on the internet. But then you can make further restrictions so that only the one laptop that follows this policy, with endpoint protection, malware security, antivirus, all that, is allowed. Again, in a volunteer-based organization that is difficult to track, but I would recommend that, as you get more mature, there should be a policy for the volunteer to only use a laptop that is properly secured from an endpoint point of view. And there are mobile device management (MDM) solutions that allow you to keep track of the assets these volunteers use to connect, so there are tools available that can help you track which devices have access, and can access, that environment. So a good inventory is a good thing. Secondly, you need to constantly look at vulnerability readiness, doing an assessment of your environment on at least a quarterly basis. And if you're doing multiple releases and launches, then absolutely, I would recommend a vulnerability assessment of that environment before every launch, so that any loopholes that are found get addressed before it goes live. That is the second thing. The third thing is to have the right policies and procedures in place for, as we discussed in the previous section, how and who is holding the data, and who's accessing the data.
There should be policies and procedures in place, and they should be followed to the T. That is very important, because having a policy is one thing, and operationalizing that policy is a completely different topic. And the last thing is doing this in a cycle. It's not a one-time thing. These high-level topics I'm mentioning absolutely need to be done on a cyclical basis. Either you do it every quarter or twice a year, but it has to be done continuously. You will have volunteers come and go. You have to keep track of who you have given access to, and when a person is no longer working with you, you have to remove that access so they don't take the data along with them. Secondly, once these people have access, do you have the right data loss prevention tools in your environment? Since they're using their personal laptops, are they downloading any personal client information, any patient information, or information about the members of your organization, and taking that information and doing something wrong with it? And then user awareness: have these kinds of conversations where we talk about and spread awareness of cybersecurity. Because remember, Megs, we live on social media. People, and I'm talking about the general public, are sharing everything about their day: how you sleep, how you eat, what you eat, where you went, where you are, all that stuff. So one has to be careful. My message to your members would be: just be careful what you put on the web, whether on Facebook or any other social media platform. Try not to add location tags; disable location tags so people don't know where you're posting from. This is specifically for people who are being abused, with somebody stalking them: you don't want your location tagged all the time.

You might want to give the bare minimum of information about what you are doing and why, because bad actors will use that against you for cyberbullying or to take advantage of your situation. So keep the information to a bare minimum. And if I had to give one recommendation: just stay away from social media. I'm no longer on Facebook. It's been 6 years.
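Viral's point about removing access when volunteers leave is essentially a periodic diff between who currently has access and who should. A minimal sketch of that quarterly review, with hypothetical account names:

```python
def stale_access(authorized_accounts, active_volunteers):
    """Accounts that still have access but no longer correspond to an
    active volunteer; these should be disabled at the next review."""
    return sorted(set(authorized_accounts) - set(active_volunteers))

# Quarterly review: bob has left the organization but still has an account.
authorized = ["alice", "bob", "carol"]
active = ["alice", "carol"]
print(stale_access(authorized, active))  # ['bob']
```

In a real environment the two lists would come from the identity provider and the volunteer roster; the logic stays the same.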

Megs Shah:

Yeah, it's interesting you say that, because we went off and created sort of a community for advocates to come together, and it's not on Facebook. We wanted to stay off a lot of the social media channels to do this. But Krishna, you work in a highly regulated environment, and Zoetis is a pretty big company compared to a lot of the DV orgs we work with. I'm sure there are certain measures that can translate over to them in terms of security. How do you ensure that devices are accounted for, especially now, with all of us working remotely? There are compromised internet networks and so many other things. What are some of the ways that Zoetis is addressing this?

Krishna Cheriath, CDO and Digital @ Zoetis:

I think some things can be translated. Of course, when you think about big enterprises, the level of investment being made in data security, data protection, and cybersecurity is orders of magnitude different from what volunteer-based and nonprofit organizations can do. But minimally, it starts with awareness: awareness of the topics Viral talked about so eloquently, making sure, starting with technology, that you know these risks exist and that you have these responsibilities to implement. That itself is step one. And I have talked to some of the nonprofits I have advised which don't even have that bare minimum of awareness. So step one is that awareness, understanding that these things exist. Step two, I think, is this: one of the really intriguing books that I read years ago was Atul Gawande's The Checklist Manifesto.

Megs Shah:

I love that book.

Krishna Cheriath, CDO and Digital @ Zoetis:

Everything is a checklist for me. Having a fundamental checklist approach to the things you need to do, whether from a data perspective or a cyber perspective, and the repeatability of implementing the steps in that checklist, some maybe manual, some maybe automated. Viral talked about making sure that you have a healthy, continuous process of evaluating who all have access to the data and making sure that is still current, or what devices are connected; those kinds of steps. These are not rocket science. These are not massive technology steps, but basic checklist-driven steps that require a discipline of (a) making it a priority and (b) doing it repeatedly. I think those are healthy starts that we can take. And the last element, which we all touched on, the people aspect, is educating those connected to you, your stakeholders, about the kinds of things you are doing and the responsibilities you expect them to take on as well; that is equally important. And that is true in a big enterprise setting too. As we say, you can have the world's greatest chief information security officer; all it takes is one employee who doesn't know which link not to click, or one compromised device, to compromise the network. It takes a village for these kinds of things. It starts with awareness. It is a responsibility of everybody involved. Everybody has a role to play, and implementing some minimal set of steps around understanding, appreciating, and monitoring can go a long way.
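Krishna's checklist-driven approach can be as simple as a list of recurring items and a report of what is still open each cycle. The items below are examples drawn from this conversation, not an official checklist.

```python
# Example quarterly security checklist for a small organization;
# the items are illustrative, assembled from points made in this episode.
CHECKLIST = [
    "Hardware and software inventory is current",
    "Access list reviewed; departed volunteers removed",
    "Endpoint protection confirmed on all devices",
    "Vulnerability assessment completed this quarter",
    "Phishing-awareness training delivered",
]

def open_items(completed):
    """Return checklist items not yet checked off this cycle."""
    return [item for item in CHECKLIST if item not in completed]

done = {"Hardware and software inventory is current"}
for item in open_items(done):
    print("TODO:", item)
```

The value is less in the code than in the discipline Krishna describes: the same list, run on the same cadence, every quarter.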

Megs Shah:

And, you know, that's a large part of what we're doing with our membership programs. We haven't launched our memberships yet; our goal is to do that in June. And Viral, to your point, what we've done, as part of our membership services, is include educational material for employees to be aware of cybersecurity: how do you build a safe website, and what are the ways you can secure some of your blog posts if you're trying to let only members read them? So there are certain trainings we're putting together for our member community as well. But really, the most sought-after service that we've learned about had a lot more to do with cybersecurity analysis: just come in and tell us what we're doing right or wrong. Because if I look at the funding challenges we've had as an organization, I have a pretty healthy and very good network, so I've had good support from individuals. But a lot of the small and mid-size organizations struggle financially just to meet the needs of their constituents, the clients that they have. They have to put a lot of their money towards the programs, and the operational side gets put aside. So the consulting service around doing cyber analysis on an annual or bi-annual basis is something they can do through our membership tiers. And I'm so happy to hear that that's the type of vigilance it takes: making sure that you're not only securing the data from a policy perspective, but educating people about what needs to happen, and actually helping them with a checklist, as Krishna mentioned just a minute ago, that can at least give peace of mind that you're taking the measures.
And then you can start to evolve that as you get bigger, where you may want to implement certain technologies. I know Splunk (splunk.com) has some free versions that you can use to monitor your infrastructure, your network, and whatever devices are connected to your network. So these are the types of technologies that could be used. And if either of you have tools or certain tech that you've used in the past that could be useful, please do share that with us, and we'll put it into the links for the podcast, as well as the YouTube video that we push out. That way our viewers and listeners can get access to it as well. The thing that I want to veer off on a little bit, because this is what I call the shiny object syndrome happening right now, is artificial intelligence and domestic violence. I've heard numerous times where people said, we're going to use Alexa, and we're going to have the AI for Alexa detect if there's violence in the environment. And there's a risk to that. There are a lot of ethics issues around that as well, because it's a listener app. What are some of the things that you see AI could be used for in a good way, and then what are some of those ethics lines that we shouldn't be crossing on the tech side? So Krishna, I'll start with you, and then we'll go to Viral.

Krishna Cheriath, CDO and Digital @ Zoetis:

Yeah, one of the intriguing articles I've read recently is in Harvard Business Review by, and I'm sure I'm not saying her name correctly, Mareike Möhlmann, who wrote that algorithmic nudges don't have to be unethical(https://hbr.org/2021/04/algorithmic-nudges-dont-have-to-be-unethical). It talks about the fact that we do live in a world of AI, whether we know it or not. The age of AI is here. When I go to Netflix, I've been so surprised by the quality of the recommendations it makes based on my interests; I have enjoyed everything it has recommended I watch. We live in a visible and invisible AI world, it is here to stay, and it is only going to continue. So I'm in the camp that asks: how do we bend it toward societal benefit? How do we use it to make our digital society better? It does require a certain level of responsible, ethical behavior, some of it connected to what we already talked about: making sure you're using it for the right purposes with the right intent, and with a bit of transparency, so that if you are making AI-based decisions or doing algorithm-based nudging, you're open about it and you invite people to learn more about it. There are things that can be done to help an individual make a better decision through the use of analytics, algorithms, or AI. So I do think it has a role to play, but those who design it, those who implement it, and those who use it have to sign up to a certain level of societal responsibility to make sure it is ethical, free of bias, used with the right intent, and protected against misuse.
And if we can take those steps: every technological innovation that has happened, from horse and buggies to cars, from phones to iPhones, has progressed humanity but also introduced new downsides. How we limit the downside and use the upside for societal benefit is something I'm keenly interested in, and I'm sure it's relevant to the conversation you're actively pioneering, Megs.

Megs Shah:

Viral, What are your thoughts?

Viral Trivedi, CBO at Ampcus Cyber:

Well, see, to develop any kind of new technology, as Krishna just mentioned, we have come a long way: from the rock to the wheel, to the horse and buggy, to the car, to Tesla engines, right?

Megs Shah:

I still drive the rock around, I'm just saying. [laughter]

Viral Trivedi, CBO at Ampcus Cyber:

We've grown, and there are always going to be some challenges as technology advances. In the end, I think there has to be some level of humanized, human-centric artificial intelligence that we focus on, rather than just relying on the AI or the algorithms to do everything. And as we develop trustworthy technology, we must also understand how AI interacts with humans, as well as with the vital social structures and institutions that are building it. I've read articles about Stanford working on a human-centered AI platform, and I've seen IBM doing this. And as more of these organizations interact with these new technologies, then absolutely, as Krishna mentioned, ethics is going to come up. When I was working at GE, we had a conference called Minds and Machines. It was an industrial IoT (Internet of Things) platform where you're taking all this artificial intelligence, collecting the data, doing predictive maintenance, and everybody started making a big deal about it: you're telling me that an environment that was siloed for four decades is now going to be connected to the internet, and some kind of AI is going to tell me how to produce milk? Those ethical questions, those challenges, those fears are all going to pop up with new technology. And at the same time, the challenge of how to do it properly and securely continues to grow as these new technologies come into play.

Krishna Cheriath, CDO and Digital @ Zoetis:

To add to what Viral said: there are some universal principles you can look to if you're thinking of applying AI or an algorithm in your world. The highest priority is to respect the person behind the data. Make sure you're matching your privacy and security safeguards to the privacy and security expectations that you and I, and any individual, would have. Always follow the law, but understand that the law is often the minimum bar and is always playing catch-up. Be wary of collecting data just for the sake of more data; have intentionality and purpose behind it. Can you mitigate the disparate inclusion and exclusion impacts of the data? And then have a mechanism to consider the impact of the algorithm and the AI on the community it affects, explaining it and being transparent about it, removing the opaqueness and the black box around it. There are some universal principles that can be followed so that you're taking the necessary actions to limit the downside while trying to reap the benefits of the upside of these capabilities.

Megs Shah:

I think we've also got to be more cautious about training the AI, because what you don't want to create, and I think you mentioned this initially, is bias. It's very difficult to keep it free of bias sometimes, and given the data we collect in the domestic violence space and how it's collected, it would be pretty skewed. But I do see two use cases that could be interesting to look at within the DV space, and they're maybe minimal in terms of what you'd really want to do without hampering someone's privacy. The first is the language barrier that a lot of DV victims have to overcome. Organizations have traditionally struggled to help translate, in person or in document form, and I feel like AI could play a pretty big part in offsetting some of those challenges. Will it ever be a hundred percent? Probably not, because linguistics itself is pretty challenging to get completely accurate just yet. I'm sure over the years we will improve it, but how you pronounce something, the tone, the words, the slang terms, there's a lot to consider. Still, I do see that as a pretty solid use case in the DV space. The second one that came to mind, as both of you highlighted, is around the nudges. Organizations may find this surprising, or maybe not, but domestic violence is actually seasonal, which is a very interesting finding from my research. I scratched my head: this is not like a holiday season, and surprisingly that is when it peaks. But the challenge organizations run into is understanding the regional resource planning that's needed around it.
So could we use an AI methodology, even bare-bones predictive analytics on the data that's there, to say: based on last year's curve of responses and requests that came through, these are the types of requests you're seeing most, and the types of resources you'll need to plan ahead for? I feel like that could help inform a lot of their program planning, as well as their donation drives and the time of year they run them. That could be an interesting use case to try as well. But I do feel like there's a lot more to be done between now and truly embracing it within domestic violence. And I get worried when I see a couple of startups that have tried to jump on this whole Alexa thing. I had alarm bells going off in my head, because that could be used in the opposite way in a DV case. Domestic violence is all about control over what the individual is allowed to do, and if you have a listening device at home, there's a high chance it could be used improperly, and a misuse of it could have detrimental consequences for the victim.
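The "bare-bones predictive analytics" idea above can be sketched very simply: average the same calendar month across prior years to flag the peak months for staffing and fundraising. This is a hypothetical illustration; the request counts below are invented, not real domestic violence statistics.

```python
# Minimal seasonal-planning sketch: forecast each month's demand as the
# average of that same month across prior years, then rank the peaks.
# The numbers here are illustrative placeholders, not real data.
from statistics import mean

# Monthly request counts (Jan..Dec) for two hypothetical prior years.
history = {
    2019: [40, 38, 45, 50, 55, 70, 80, 78, 60, 52, 48, 65],
    2020: [42, 40, 47, 53, 58, 74, 85, 82, 63, 55, 50, 70],
}

def forecast_month(history, month_index):
    """Average the same calendar month (0-11) across all prior years."""
    return mean(year[month_index] for year in history.values())

def peak_months(history, top_n=3):
    """Return the calendar months (1-12) with the highest forecast demand."""
    forecasts = [(m + 1, forecast_month(history, m)) for m in range(12)]
    forecasts.sort(key=lambda t: t[1], reverse=True)
    return [month for month, _ in forecasts[:top_n]]

# Months to staff up and time donation drives for, based on history.
print(peak_months(history))  # [7, 8, 6] for the sample data above
```

Even a toy model like this gives an organization a defensible starting point for timing program capacity and fundraising, without touching any individual's private data.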

Krishna Cheriath, CDO and Digital @ Zoetis:

I think the use case you started with, solving the language barrier with AI as an enabler, is a contained use case for a particular problem within the overall picture, and it's an intriguing one. I came across a startup called jnani.ai, and it does remarkable work with Indian languages, which have so many varieties and so many dialects. Its effectiveness at translating them into a common language, and then using that to cross the language barrier, produced surprising results. So that could be a contained use case: help solve the barrier, bridge between the victim and the help, and let AI keep language from becoming the reason somebody can't get out. It's a great use case.

Megs Shah:

Yeah, I actually know an organization doing something similar. I don't know if they're continuing with the work or not, but I know they started out this way. Sama(https://www.sama.com) is the name, and they've been helping nonprofits with human-plus-AI translations, which is equally important. As we all talked about, it has to be an augmentation of what humans do, not a replacement. I think it's important to keep that in mind, because we can't do this without some kind of human intervention for validation, at least not yet. The technology's not quite there, and it's a valid point to make sure we're all realistic about its capabilities at this point. So, as we come toward the tail end of this, I want to open it up to both of you for any lasting thoughts, something our listeners can take away from this session.

Krishna Cheriath, CDO and Digital @ Zoetis:

The thought I'll leave behind for the listeners is this: data privacy, data ethics, and cybersecurity can all feel overwhelming, but you can start small. It starts with awareness. It starts with implementing the minimum mechanisms you need to protect yourself. I recommend not ignoring the topic, and seeking help. It is important, because trust is a key part of the equation you're part of, and thinking about cyber and data trust alongside the other elements will only amplify and improve the kind of impact you can make. So I recommend you start small, take it seriously, and take one or two effective steps in the right direction toward better protection of data and cyber.

Megs Shah:

Fantastic.

Viral Trivedi, CBO at Ampcus Cyber:

Thanks, Krishna, for that point. Ninety percent of cyber attacks are successful because of some kind of human error; that's a statistic from Gartner I've seen. So my takeaway is that, as an end user, cybersecurity starts with you as an individual. Change your passwords every 90 days. Be careful of what you post and where you post it. Make sure you are constantly looking out for suspicious links, suspicious emails, or phone calls; these days a lot of hacking also happens through fraudulent phone calls, so be vigilant about that. And be careful that you are not accessing any kind of private information, your banking or any personal accounts, on a publicly available wifi network. If you think you can go to Starbucks, connect, and check your bank account statement, it's not secure. So those are the few takeaways: change your passwords regularly, don't access private information in public environments or on public wifi, and constantly keep yourself aware. And Megs, I will share with you some brochures and one-pagers you can pass on to your members on how to create a strong password, plus some bite-sized videos on identifying a phishing attack.
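The password advice above can be captured in a small checklist an organization could automate. This is a hypothetical sketch; the specific rules (length, character classes, a tiny known-weak list) are illustrative choices, not an official standard.

```python
# Minimal password-strength check, mirroring the spoken advice: reject
# short or known-weak passwords and require a mix of character classes.
# The thresholds and weak-word list here are illustrative assumptions.
import re

WEAK_PASSWORDS = {"password", "123456", "qwerty", "letmein"}

def is_strong(password: str) -> bool:
    """At least 12 chars, upper + lower case, a digit, a symbol,
    and not on the known-weak list."""
    return (
        len(password) >= 12
        and password.lower() not in WEAK_PASSWORDS
        and re.search(r"[a-z]", password) is not None
        and re.search(r"[A-Z]", password) is not None
        and re.search(r"\d", password) is not None
        and re.search(r"[^A-Za-z0-9]", password) is not None
    )

print(is_strong("password"))          # False: on the known-weak list
print(is_strong("Ch4i-and-Chat!21"))  # True: long, mixed case, digit, symbol
```

A check like this could sit behind a staff onboarding form or be handed out as part of the one-pagers Viral mentions.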

Megs Shah:

So, setting your password as "password" doesn't really count, right? You cannot do that. [laughter] You know, it's interesting you mentioned being vigilant about emails, since that's where human error happens. There is a simple way to check whether an email is legitimate: open up the header of that email, because in the header you will actually see the address and domain it's coming from. Then you can research that domain to see if it's a legitimate organization. And if you find that it isn't, and you see a real organization's name being used, reach out to that organization and let them know someone is using their name to tell people they're going to get grants or money. Those things really help, and not just the security within our own organization, but also the organizations supporting us in the long run, because they may have had a data breach or a security breach and may not even realize it. The moment you recognize that, you should bring it up. I'll leave us with that thought, but I want to say an amazing thank you to both of you for being on this. This topic is such a huge deal in domestic violence, because at the core of everything we do is trust with the clients and customers we're dealing with every day. And I hate to call them clients and customers, but unfortunately that is the case, because they have to be served through the various services and things we do. So I really appreciate all the tips you've provided. And for all the listeners and viewers, we're going to go ahead and provide links to all the materials.
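The header check Megs describes can be sketched with Python's standard `email` library: parse the raw message, pull the sender's domain from the `From:` header, and compare it against the `Reply-To:`. The sample message and its domains below are invented for illustration.

```python
# Hypothetical sketch of checking an email's headers for legitimacy.
# Parse the raw message, extract the sender's domain for research,
# and surface a mismatched Reply-To as a red flag. Sample data is fake.
from email import message_from_string
from email.utils import parseaddr

raw = """\
From: "Grant Committee" <grants@totally-legit-charity.example>
Reply-To: attacker@suspicious.example
Subject: You have been awarded a grant!

Click here to claim your funds.
"""

msg = message_from_string(raw)

def sender_domain(msg):
    """Domain from the From: header -- the place to start your research."""
    _, addr = parseaddr(msg.get("From", ""))
    return addr.rsplit("@", 1)[-1] if "@" in addr else ""

print(sender_domain(msg))   # domain to look up before trusting the message
print(msg.get("Reply-To"))  # a Reply-To that differs from From: is a red flag
```

In a real investigation you would also look at the `Received:` chain and authentication results (SPF/DKIM headers), but even this minimal check catches the common grant-scam pattern Megs describes.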
If either of you have any additional content you want to share, please do send it over and we'll put it in there. And if there are additional sessions or specific questions any of you have, please be sure to send them to us. We will do another session around this if we need to, and dive into more specifics with either one of you, if you're still open to it. So, appreciate it. Thank you again, and have a wonderful rest of your day.

Music:

Thank you. Thank you. [inaudible]