
Regulating Technology: Protecting Human Rights at the Border – Petra Molnar – S8E26

Borders are a legally unregulated space where human rights are often not protected. To help us explore these liminal spaces, we invited Petra Molnar, the Associate Director of the Refugee Law Lab at York University and Faculty Associate at the Berkman Klein Center for Internet & Society at Harvard University, onto the show.  

So why should you be listening in?  

You can hear Rob and Petra discussing:  

  • The Migration & Technology Monitor 
  • Common legal issues around migration in America 
  • AI discrimination  
  • The criminalisation of migration  
  • Witnessing and storytelling 

🤖 Want to learn more? Head over to this episode on Perplexity for AI assistance to answer your questions. Click the link above to explore the topics discussed in this episode in more detail.

 

Transcript

Robert Hanna 00:00 

Welcome to the Legally Speaking Podcast. I’m your host, Rob Hanna. This week I’m delighted to be joined by Petra Molnar. Petra is a lawyer and anthropologist specialising in migration and human rights. Petra has been working in migrant justice since 2008. With experience as a settlement worker and community organiser, she has worked in Jordan, Turkey, the Philippines, Kenya, Colombia, Canada, Palestine and different parts of Europe. Petra is the co-creator of the Migration and Technology Monitor, investigating the technological experiences of people crossing borders. She is also the Associate Director of the Refugee Law Lab at York University and a Faculty Associate at the Berkman Klein Center for Internet and Society at Harvard University. Petra’s articles have been featured in The New York Times, The Guardian, Al Jazeera and Just Security. Earlier this year, Petra published her book, The Walls Have Eyes: Surviving Migration in the Age of Artificial Intelligence. So a very big warm welcome, Petra.

Petra Molnar 01:03 

Thanks so much. Rob, I’m delighted to be here with you.  

Robert Hanna 01:06 

Oh, it’s an absolute pleasure to have you on the show. Before we dive into all your amazing projects and experience to date, we do have a customary icebreaker question here on the Legally Speaking Podcast, which is: on a scale of one to 10, 10 being very real, what would you rate the hit TV series Suits in terms of its reality of the law, if you’ve seen it?

Petra Molnar 01:30 

Oh, I have seen it. And in fact, as somebody who spends part of my time in Toronto in Canada, where Suits, the show, was filmed, it holds a really special place in my heart. But in terms of reality, you know, I think it depends on the episode, but probably around the two out of 10, three out of 10 mark. It’s definitely a very Hollywood-ised version of the law.

Robert Hanna 01:50 

Yeah. And I think we’re getting a lot of twos and threes recently, in terms of people saying, yeah, it’s good for certain aspects of entertainment, but in terms of the harsh realities, probably not quite there. But we should move swiftly on to talk all about you. So would you mind telling our listeners a bit about your background and career journey?

Petra Molnar 02:09 

Yeah, I would love to. It’s definitely been a career journey of many twists and turns. And, you know, no one’s more surprised at where I ended up than me, to be honest, because I never planned to be a lawyer. I’m a former classical musician; I was training to be a flautist my entire childhood all the way into university. And then I had a massive change of heart. I kind of always felt a pull to social justice issues. And I felt like with music, you know, it’s a very, very important area to work in, but I kind of felt a disconnect. I was always wondering, you know, how can I be more engaged with some of the human rights and social justice issues? Also, given my background, as somebody who has had, you know, my own fair share of difficult experiences and migration journeys, I kind of felt a connection to working in immigration and refugee issues. And so I went to grad school for anthropology and then to law school with the kind of laser focus of wanting to work on these issues. But yeah, that was never really part of the plan. I thought maybe I would work in an NGO or more kind of in the academic space; law just kind of happened as a happy accident.

Robert Hanna 03:19 

Yeah, definitely a happy accident. And you’ve achieved a heck of a lot. And I want to kind of jump a little bit more into the work that you’ve done as a lawyer, if I can get my words out, specialising particularly in human rights and migration. So can you just give us a flavour of the type of work that you’ve been getting up to?

Petra Molnar 03:39 

Sure. Yeah. So I was working in the more kind of, I guess you could call it traditional areas of immigration and refugee law, like gender issues, immigration detention, but nothing to do with tech. You know, that was something that I fell into very randomly back in 2018, when I was working as a research lawyer after about two and a half years of practice in court, which nearly killed me, if I may say so. I’m not a litigator; it’s just not a good fit for me. So I was kind of transitioning towards policy and academic work at that time already. And then through, again, a series of misadventures, and this seems to be a theme in my life, I fell into this area of trying to understand the intersection between technology, immigration and human rights. And we ended up writing a little report in Canada, where I was at the time, looking at the way that algorithms and new technology are being introduced into the mix. And then, because I’ve always had this kind of global lens to try and understand these issues from, I thought, well, what’s happening across the world? And then the project and the work organically grew out of that.

Robert Hanna 04:48 

I love that. And again, I want to dive a bit deeper, because this is really meaningful and impactful work, and that’s why I was so keen to have you on the show today. As the co-creator of the Migration and Technology Monitor, investigating the technological experiences of people crossing borders, could you tell us more about the collective?

Petra Molnar 05:10 

Yeah, I’d be delighted. This has been probably the most important thing that I’ve really had the privilege to do in my career, to start this initiative. It started off as an archive and a platform to hold some of the work that’s looking at surveillance and border technologies and these other really high-risk projects, but it’s grown into more of a community. And we thought, instead of kind of replicating yet another Western-based institution, you know, of experts who kind of sit in their armchairs and study the world, we want to platform and support people on the move to tell their own stories on the impact of technology and surveillance and borders. And so we incubate projects from colleagues on the move, who are currently in refugee camps or at borders, and who are doing really groundbreaking work in this area. We started a fellowship programme, for example, last year, where we fund five people directly with a living wage for a year to be embedded with us. And we were lucky enough to be able to do that again this year. And yeah, it’s just been really marvellous to also try and think about how we work differently in this space, you know: who counts as an expert, and who gets to be around the table when we talk about what we innovate on and why, especially when it comes to really high-risk technologies that are impacting real people. For us, it’s been a really great way to make space for those voices that are doing critical work on the ground.

Robert Hanna 06:38 

Yeah, and it’s such valuable work that you’re doing, and you know, I commend you. And you also talk a lot about community, and that’s something I’m very passionate about as well, because with the power of community so much can be achieved; you’re better together. I just want to share this, you know, you say: our primary goal is impact, impact to share knowledge, destabilise structures of hierarchical power and create space for participatory work at the intersection of migration, technology and human rights. So what projects have the Migration and Technology Monitor been part of recently?

Petra Molnar 07:15 

So on our archive, which you can check out (it’s actually available in multiple languages: French, Spanish, Arabic and English), we house some of the work that we’ve been doing. For example, we have an interactive map of all the different border technologies around the world, kind of doing the monitoring or tracking element of our work. But actually, the work that I want to highlight is, again, the work of the colleagues who are currently in displacement or on the move. And the first cohort finished up just a month ago. And it’s been really fascinating to see the breadth of projects, Rob. I mean, it really ranges from, you know, my colleague Veronica Martinez, who is from a border community on the Mexico side, in Ciudad Juarez. She has been looking at, for example, the use of facial recognition technology through the so-called CBP One app that the border patrol in the US is using to surveil people on the move. She’s been doing really amazing work on this. You know, half of our colleagues look at the kind of sharp edges of the technology, the surveillance, the drones, the AI. But there are also projects that look at technology as a way to empower mobile communities and to think more creatively about how we can build a different world. For example, our colleague Simon, a data analyst and a refugee living in Uganda, created this amazing programme called the Memory Scroll, which is basically a digital archive for his mobile community. I would really urge you to check it out, because it’s really fascinating to also look at tech as a way to strengthen people’s human rights and make their experience better.

Robert Hanna 08:48 

Yeah, and again, it’s just fascinating stuff, just listening to what you’re doing. And it’s very progressive, too. But, again, whenever you’re trying to do these things, there are going to be challenges and issues. So what are the most common issues arising from migration and technology in America, and indeed around the world?

Petra Molnar 09:06 

Yeah. I mean, this is a global issue, for sure. I mean, we’re seeing the proliferation of technology across different borders, in refugee camps, and even as people are crossing other boundaries and moving between states. We’re talking about, you know, very high-risk surveillance technology like drones, automated cameras, and even really, really kind of outlandish projects like robot dogs, which were recently unveiled at the US-Mexico border, or AI lie detectors that the European Union was experimenting with, biometrics that are being tested out on refugees, visa triaging algorithms that might separate you from your spouse for many years, all of these kinds of technologies. One of the key issues is that they’re being introduced without public scrutiny, or with very little accountability really. And the host of human rights issues at play is vast, from privacy issues and data sharing that is often inappropriate, to discrimination and equality rights being impacted when we’re dealing with facial recognition technology that’s explicitly racist, for example, against darker-skinned people, you know, to a whole host of other, perhaps more subtle but equally important, things to think about, like the right to appeal. What happens if you’re deported as a result of an algorithm? How can you meaningfully challenge this? One trend that I’ve been noticing across all the different borders that I’ve worked at is that there’s this kind of obsession with techno-solutionism, right, like using technical solutions for very complicated things. And often the way that these projects are rolled out just does not take into account the human impact that they have, because they’re hurting real people. And we need to have these conversations at a broader level to really get at all the impacts that are being experienced currently.

Robert Hanna 10:57 

And you mentioned robo dogs, which got me curious. You know, for people who may be less familiar, what is border technology? And does that include the likes of robo dogs, DNA collection, algorithms, etc.?

Petra Molnar 11:13 

Yeah, I mean, you know, we’re seeing a host of different projects. And what helps me sometimes to think about it is to trace the journey of somebody who’s moving. So if, for example, you are planning to move, you know, there’s a host of projects that you’re interacting with before you even leave your country; states are doing social media scraping of your online activity, for example, and things like that. Then as you start moving, and you get to a border or a refugee camp, there’s a host of different technologies there, you know: biometric data collection such as iris scanning, fingerprinting, drone surveillance in the sky, sound cannons that Greece was experimenting with at the Turkey border, for example. And robo dogs, those are perhaps the most draconian example, you know, the kind of quadruped, military-grade machine that you might see in a sci-fi show that is now joining the global arsenal of migration management technology. And then there’s also a host of other things that happen once you’ve already crossed into a new territory, whether that’s, you know, a visa triaging algorithm, or voice printing technology for identification, or different kinds of carceral technology if you’re in immigration detention, like ankle monitoring. But again, so many of these projects are rolled out without appropriate human rights standards in place, and without any kind of oversight. Really, a lot of the time we find out about them after the fact. And to me, that’s one of the things that my work tries to highlight: it’s not an accident that this happens at the border, right? Because the border is this kind of free-for-all zone that is very opaque and discretionary, and sometimes difficult to understand. But imagine the uproar if a robot dog was, you know, walking around a grocery store or a hospital. Except the thing is, it’s already happening, right? So a year after the robot dog was introduced at the US-Mexico border, the New York City Police Department announced that they wanted to roll out robo dogs in the streets of New York to, quote unquote, keep New York safe. That’s a direct quote. And one was even painted like a Dalmatian, white with black spots on it, I guess to make it cute. So this stuff doesn’t just stay at the border, right? This is why I think we all need to pay more attention to what’s happening.

Robert Hanna 13:23 

Yeah, and I guess that leads nicely on to what I wanted to talk about next, which is around regulations. Why are there no regulations in place to govern the development and deployment of dangerous border tech?

Petra Molnar 13:36 

Yeah, I mean, you know, if I had to sum up the last six years of my work in this area, it would be precisely this. It’s not an accident that there’s very little law and regulation and governance, because it allows states to experiment in these ways that wouldn’t be allowed in other spaces. And the border already is this kind of free-for-all: so much discretion, so much opacity, right? Like, when you’re at the airport, people get screened for secondary processing for reasons that we sometimes don’t understand. There’s so much power, right, that immigration officers have. And if that’s the kind of starting point, and you augment or replace human decision making with machine learning and AI and all sorts of different projects, it just creates this high-risk laboratory. And it’s an interesting question to ask at this moment, too, right? Because we’re seeing different jurisdictions play around with trying to regulate artificial intelligence and new technologies. The European Union’s newly released AI Act, right, the first regional attempt to govern artificial intelligence, once again is really weak on border tech. And why? Well, because migration is a politically expedient issue. And again, there’s no incentive to regulate this, because the laboratory must keep churning out high-risk projects. And the last thing I want to say is also the involvement of the private sector in all of this. There’s big money to be made in border tech. Right? Some colleagues, like my friend and journalist Todd Miller, who does really amazing work in Arizona, have been talking about the border-industrial complex, and it’s a multibillion-dollar kind of complex where different projects are being tested out, right? Robo dogs are being presented as a solution, quote unquote, to migration. And so if companies make a lot of money, why would they want to regulate? Right? There’s no incentive to do so.

Robert Hanna 15:21 

Yeah, and it’s a valuable point, you know, particularly when it comes to the sort of money and profit side of things. But I want to stick to the human rights, because what human rights issues are prevalent as a direct result of these unregulated border technologies? And has this led to any loss of lives, for example?

Petra Molnar 15:43 

Unfortunately, it has. That’s some of the sharpest manifestations of all of this, right? And I think it bears also talking about the fact that we’re talking about an area that’s underpinned by international human rights law and refugee law; people are exercising their internationally protected right to seek asylum at many borders. And when new surveillance technology is introduced, sometimes under the guise of, you know, trying to prevent people from coming, well, guess what, that actually doesn’t work. Statistics don’t bear it out. I’ve seen it firsthand. What happens, instead of people not coming, is they take more dangerous routes, either in the Sonoran Desert in Arizona, or at the fringes of Europe through the Mediterranean or the Aegean, or even the land borders. As people are trying to evade surveillance, they oftentimes either get really hurt or die. I’ve visited, you know, grave sites in Greece; I’ve been to different graves in Arizona, in the Sonoran Desert, that are a direct result of this kind of sharpening of borders through technology. That’s some of the sharpest manifestations, for sure. But there’s a host of other human rights issues too, you know. Again, equality and discrimination, when we know that a lot of algorithmic decision making is highly biased, right? Privacy issues too, when you’re collecting really sensitive data and then storing it, sharing it indiscriminately between different actors. Again, it’s really, really sensitive stuff that we have to pay close attention to.

Robert Hanna 17:11 

Yeah, and you touched on it there, and I want to talk about this; particularly when it comes to AI, discrimination is a big point. You know, do you believe AI discriminates against different groups in society? And does border technology fuel a certain perspective on immigration?

Petra Molnar 17:29 

Absolutely, right. Because AI and technology is a social construct, just like language and law and policies. It’s nothing that we can kind of pluck out of thin air and say it’s a neutral tool. Again, when we’re talking about a world that’s very biased, very discriminatory against people on the move, and the datasets that underpin AI and algorithmic decision making are also very biased, well, it makes sense, right, that it would be just as biased, or biased in different ways, than a human officer might be. And I’m not saying human decision making is perfect either, right? As someone who used to be in court representing refugees, I know how biased judges are. But the answer to a broken system isn’t a band-aid solution that actually exacerbates biases even more. And I think that’s where my perspective on this kind of comes from, right? I’m not a technologist. I’m someone who’s trying to understand this from a perspective of power. And technology exacerbates power differentials in society. And to me, it’s again highlighted, not to keep bringing back the robo dog example or the AI lie detector example, but I think they’re very instructive, right? Why are we developing AI lie detectors to test out on refugees, and not using AI to root out racist border guards or audit immigration decisions? That’s a clear set of priorities as well that’s being pushed, right, by particular actors. And it’s ultimately about power: who’s around the decision-making table and who gets to determine what we innovate on, and why. These all kind of break down along awful lines of bias and discrimination, and who’s even allowed to be in the room in the first place.

Robert Hanna 19:02 

Yeah, you make so many good, valuable points and things for us to really reflect and think about deeply. That leads on to what I want to ask you about: the global rise of the criminalisation of migration. How is it blurring the lines between migration management and criminal law?

Petra Molnar 19:22 

Yeah, this is a really important point to keep in mind: the fact that human movement and migration has been criminalised. And this is nothing new, but again, there’s this assumption that people who are crossing borders are somehow a threat, or some group that must be kind of managed or made intelligible, and that they’re criminals unless proven otherwise, right? But it’s completely upside down. You know, sometimes we talk about the kind of reverse onus principle, right? Because in criminal law, at least in most jurisdictions, you’re innocent until proven guilty. But under refugee law, you’re not a refugee unless proven otherwise. And it’s underpinned by so many problematic assumptions, right, about how a person who is fleeing violence is supposed to act, what certain push and pull factors are for migration, for example; you know, all of these kinds of ways that we look at the world in a box. I think this is why I really struggled, you know, being a litigator, because I never understood why we have to stick to these really rigid categories, right? Like, you’re a refugee if you meet a five-part definition, and if you don’t, then somehow you’re not eligible for protection. Like, why, you know, why is it so rigid? I understand we need a certain scaffolding in place as well. But what I see time and again is that we’re just not able to think about the complexity of being a human in the world, and being a human on the move in the world, in this complicated reality especially. And what technology is doing is it’s again creating more and more rigid boxes, right? That example of the AI lie detector, to me, is so instructive. And for those of you listening who might be immigration lawyers or refugee lawyers, this might really ring true, because, you know, I’ve seen, again, judges make super problematic decisions about someone’s plausibility, or whether they’re lying or not, based on just their comportment, or how they’re dressed, or what they look like, right? And we know that human decision makers make really problematic assumptions. So what about AI? Like, how can AI lie detectors deal with differences in cross-cultural communication, for example? I’ve worked with people who don’t make eye contact with a decision maker of the opposite gender, maybe because they’re nervous, or because of religious backgrounds, or anything in between, right? Or what about the impact of trauma on memory, and the fact that we don’t tell stories in a linear way anyway, right? Human decision makers struggle with this. And the fact that we’re playing around with these high-risk technologies that don’t actually really even get at the complexity of being human is very, very worrisome.

Robert Hanna 21:51 

Yeah. And I want to really pick up on this human side of things, because how can we humanise these issues, specifically by witnessing technologies and sharing stories from people on the move?

Petra Molnar 22:03 

Yeah, the witnessing part, I think, is definitely important, kind of shining a light on the reality of what’s happening across the world’s borders, and these kinds of, again, these technological testing grounds that we’re seeing happen in migration. One way that I’ve tried to do it, to illuminate these issues for the public and for policymakers, is to centre human stories, because technology can seem so abstract sometimes, and we forget that there are real people and real communities that are at the sharpest edges of this tech. And so it’s about finding commonality, and highlighting the kind of human stakes in all of this. And that’s what my book tries to do: not to just be, like, a legal text or, you know, a compendium of technology, but rather, each chapter talks about a particular context or a person, either on the move, or people who are helping them as well, because it’s amazing to see the kind of solidarity networks that have grown up around migration at different borders. But I think it’s human stories that will really shift the dial on a lot of these issues.

Robert Hanna 23:07 

Yeah, and so true. And that, again, kind of flows nicely on to what I want to ask you next, and I also want to pick up on your book in a moment, which is fantastic. Why is there a necessity to redistribute resources to people on the move? And how can we support storytelling from the ground level up?

Petra Molnar 23:25 

Yeah, you know, that’s the kind of ethos that we hold really, really dear at the Migration and Technology Monitor project. Again, instead of kind of hoarding resources and keeping them in academic institutions or think tanks, we want to move resources directly into people’s hands, on the ground, in communities on the move, so that they can be in the driver’s seat: determining what issues matter, what we should be looking at, having involvement in policy conversations, in funding conversations. It really is about redistribution of resources, and giving people space and time to also do good work, right? That’s what our fellowship programme tries to do. Instead of giving small grants, we give people a living wage, you know. And we could have had, like, 20 or 30 fellows, but we chose to have five properly funded colleagues. And ultimately, that’s what it’s about. It’s about working in a participatory way and a horizontal way, where we are all equal and are all colleagues in this space. Because that is really where I think innovative and new thinking on these issues can come from. Because ultimately, that’s what it’s about, right? It’s about who actually is involved in the conversations around proposed interventions. Why is it always, you know, folks in the private sector, or in Brussels or London or Washington, behind closed doors, making these decisions that ultimately harm real people? It should be affected communities who are really leading the charge on this.

Robert Hanna 24:46 

So you mentioned briefly before your book, which I’m a big fan of, but I want to ask a bit more around that: The Walls Have Eyes: Surviving Migration in the Age of Artificial Intelligence, which was published earlier this year. What are some of the key themes you address in your book, and what do you hope for your readers to learn from diving into it?

Petra Molnar 25:05 

Well, thanks for highlighting the book. Yeah, it’s definitely been a labour of love and pain for about six years, so it’s very bizarre to see it out there in the world; it was just published a few weeks ago. And it’s not an academic book; it’s really meant more as a way to, again, talk about these issues from a human perspective. And it kind of takes the reader through the different areas of the world where I’ve been working and meeting people and working directly with affected communities, you know, from US-Mexico, to different borders around Europe, to Israel and Palestine, to Kenya, to really try and take a global approach to these issues and ground it in human storytelling, both in terms of the impact, but again, in terms of the kinds of ways that we can also think about the world differently. And, you know, a lot of it is a little bleak, and it can be a bit of a hard read, not to dissuade readers. But it also ends on a hopeful note, you know: how do we actually build a different world? What are some of the strategies of resistance and joy and creativity? Because at every single border where, you know, really difficult things happen, really good things happen too. For example, you know, you have 70- and 80-year-olds who are part of these search and rescue teams in Arizona, going to the desert and, like, literally shuffling along, sometimes doing water drops for people. You have amazing lawyers in Greece supporting asylum seekers, and people fighting against digital oppression around the world. And that, to me, really highlights again the kind of human stakes: that instead of dividing ourselves, we actually could come together and work towards a different world.

Robert Hanna 26:41 

Yeah. And as a mentor said to me, we is greater than me. And I love that you talk about the positivity as well; I can hear from the energy and the passion in your voice that you’re really trying to, you know, make the change. And as I say, I would encourage people to definitely go and check out your book. I want to talk about your Faculty Associate position at the Berkman Klein Center for Internet and Society at Harvard University. Could you tell us more about your involvement with the centre?

Petra Molnar 27:07 

Sure, yeah. So I’ve been associated with them for just about two years now. I was invited to go as a fellow, to have a little bit of a break from some of the on-the-ground work across all these different borders, because it does take its toll, and I was finishing up the book. And I was part of this amazing cohort of people working on all sorts of tech and society issues. And then they invited me to stay on as a Faculty Associate to kind of help incubate the work and foster connections. And it’s been a really fascinating place, because it’s incredibly multidisciplinary, right? Like, you have people working on social media and surveillance and digital rights, broadly speaking, and it’s just been wonderful to continually learn from each other in this really dynamic and global community, too.

Robert Hanna 27:57 

You know, it’s a great theme of the discussion today, the sort of power of community, and again, it’s something that I’m really passionate about. Let’s talk about one of your other roles as well, because you keep so busy: you’re also the Associate Director of the Refugee Law Lab at York University. So what initiatives is the law lab currently working on?

Petra Molnar 28:16 

Oh, yeah, I wear many hats. So the Refugee Law Lab is actually kind of like my main academic home, and that’s where we run the Migration and Technology Monitor project out of. And it’s a really interesting little lab that, you know, I’ve been really, really lucky to be involved with for a few years now, I guess four years already. And we kind of launched it as a home for a lot of the work at the intersection of migration and tech. I do more of the kind of surveillance and sharp edges and the kind of human rights impacts of border technologies. But there’s this whole other half to it that might be of interest to your audience too, kind of looking at legal technologies as a way to upskill refugee lawyers and immigration lawyers, and to, again, also kind of level the playing field between, you know, people who are representing asylum seekers and the government, for example, which has access to a lot more technology. So my colleague Sean Rehaag, who’s like the other half of the lab, does amazing work on kind of big data analytics, for example, taking a look at vast data sets of refugee cases in Canada and coming up with really helpful ways for refugee lawyers to learn how better to prepare, for example, for particular cases, and just really innovatively thinking about how technology can also be a useful tool for those of us who are practising law. So we kind of try and look at borders and technology and migration from these two sides. And it’s been really a lovely space to work in and incubate a lot of these projects. Yeah.

Robert Hanna 29:43 

And I’m just amazed how you fit it all in, and all the amazing work that you do. I can’t believe we’ve managed to get through so much in such a short space of time. But I want to ask one final question, which is: what advice would you give for those interested in migration and human rights law as a career?

Petra Molnar 30:03 

Yeah, that’s a great question to end on. I mean, I think my one piece of advice would be to lead with your heart. I know that sounds a bit cheesy, but it’s not an easy area to work in, right, because you see injustice and systemic injustice every day. And it can be very difficult to know how to kind of make your way through this world that seems to be so sharply positioned against the people that you’re working with. But what keeps me going is knowing that there are amazing people who are so committed and so passionate about helping people all around the world, that it really kind of galvanises me to keep going. Because we need good people in this area, for sure. You know, it’s definitely not a lucrative area to work in or anything like that, but it gives you so much meaning and so much kind of joyful resistance to the way things are as well. I think there are a lot of dreamers and schemers in this area of law, and it’s definitely a very rewarding place to work.

Robert Hanna 31:03 

Yeah, and you’re absolutely leading the way in terms of inspiring and pushing for change and all the wonderful work that you’re doing. And I’m sure our listeners would like to learn more about your career, the Migration and Technology Monitor, the book, the Berkman Klein Center for Internet and Society at Harvard University, or indeed the Refugee Law Lab at York University. So where can they find out more? Feel free to shout out any websites or social media handles and links, and we’ll also share them with this episode for you too.

Petra Molnar 31:28 

Oh, sure. Yeah, I mean, you can probably start with my website, which is just my name, petramolnar.com. Or you can follow me on Twitter. Or you can check out our Migration and Technology Monitor platform, which is migrationtechmonitor.com, available in different languages. That’s definitely where a lot of this work lives.

Robert Hanna 31:46 

Fabulous. Well, look, it’s been an absolute pleasure having you on the Legally Speaking Podcast, Petra. I thoroughly enjoyed learning more about your career, all the great work that you’re doing, and just the power of community that you’re building as well. From all of us on the Legally Speaking Podcast, wishing you lots of continued success, but for now, over and out.

Enjoy the Podcast?

You may also tune in on Goodpods, Apple Podcasts, Spotify, or wherever you get your podcasts!

Give us a follow on X, Instagram, LinkedIn, TikTok and Youtube.

Finally, support us with BuyMeACoffee.

🎙 Don’t forget to join our Legally Speaking Club Community where we connect with like-minded people, share resources, and continue the conversation from this episode.

Subscribe to Our Newsletter.

Sponsored by Clio – the #1 legal software for clients, cases, billing and more!

💻  www.legallyspeakingpodcast.com

📧  info@legallyspeakingpodcast.com

Disclaimer: All episodes are recorded at certain moments in time and reflect those moments only.
