Posted 4 months ago by Heather West
We live so much of our lives online. Building a healthier internet is part of protecting our way of life, and is central to Mozilla’s mission. But we can’t protect the internet alone – it’s a shared responsibility. Participating in conversations with all the stakeholders allows us to learn from others in the field and to share the Mozilla perspective. In our ongoing efforts to make the internet safer, Firefox Security Lead Richard Barnes will be speaking on a panel at Stanford Law School’s February 2 event “Government Hacking: Assessing and Mitigating the Security Risk.” To attend in person, RSVP here. We’ll also recap it here on the blog.

This continues the theme of several of the panels I participated in late last year, where I discussed the future of cybersecurity and internet privacy with industry leaders – see below to read excerpts and watch the videos, and let us know what you think!

As part of the Coalition for Cybersecurity Policy & Law, I went to a day-long symposium, “Cybersecurity Under the Next President,” where I discussed the process by which the government decides if and when to disclose security vulnerabilities. This is known as the vulnerabilities equities process, or VEP, and it is an important part of Mozilla’s work toward a secure internet because of the lack of government transparency about its use. On this panel, I spoke about reforms the government could take to improve the current vulnerabilities equities process.

“In a perfect world I would like this process to be robust – and that may mean a legislative solution such that they have to undertake this process and they have to have certain interests at the table when they consider a given vulnerability. I want them to have a timeline and a process set out that helps us understand how long it takes to get from discovery or acquisition, to consideration to disclosure or nondisclosure.
We want independent oversight and transparency to the process… into how it works and how the disclosure is handled. We want to make sure that civilian agencies whose mission is to create trust, secure the internet and secure the American people are involved and engaged in this process. Those steps would significantly increase trust. Making sure that everything goes through the Vulnerabilities Equities Process would be very helpful.” Video from this panel can be found here.

The next day, I joined a panel of academics and policy experts at the Center for Internet and Society at Stanford Law to address how government and industry can work together to strengthen the process and discuss varied perspectives. At this event, part of the series co-hosted by Mozilla, I joined experts to explain the biggest problems with the current vulnerabilities equities process.

“It only sees a small fraction or some fraction of the vulnerabilities held by the government. Specifically as we move into a connected world – the internet of things – more agencies are going to come into contact with more exploits.”

That’s why Mozilla believes it’s essential for the government to codify the use of the vulnerabilities equities process. “If we can make this go across the government — make it broadly used, that would be a significant step forward. Of course we would have to adequately resource that.” To watch the video of the panel, visit https://www.youtube.com/watch?v=lTwct5qMKC8.
Posted 4 months ago
I’ve written before on this blog about my current project with Mozilla’s Connected Devices group: Project Haiku. Last week, after close to 9 months of exploration, prototyping and refinement, this project was put on hold indefinitely. So I wanted to take this opportunity - a brief lull before I get caught up in my next project - to reflect on the work and many ideas that Project Haiku produced. There are several angles to look at it from, so I’ll break it down into separate blog posts. In this post I’ll provide a background of the what, when and why as a simple chronological story of the project from start to finish.

Phase 0: Are we solving the right problem?

Back in March 2016, with Firefox OS winding down and most of that team off exploring the field of IoT and the smart home, Liz proposed a vision for a project that would tackle smart home problems in a way that was more grounded in human experience and recognized the diversity of our requirements from technology and our need to have it reflect our values - both aesthetically and practically. I had been experimenting with ideas like the smart mirror, and this human-centric direction resonated with me. A team gathered around her proposal and we started digging. It quickly became clear that the “smart home” box wasn’t a useful constraint. Connecting things around the home in a way that felt valuable and reflective of the principles we’d identified for this project was proving elusive. So we stepped back and did some design thinking: are we asking the right question? What do people really want from technology in the context of the home? And which people are we talking about? This led us to a study in which we interviewed a set of teens and retirement-age folks on themes of freedom and independence in the home. You can find more details on the study here.

Connecting people

Of the themes that emerged from this study, we chose to focus on that of connecting people.
We saw the same needs repeated over and over: people wanted to share moments, to maintain a presence in each other’s lives. At the same time there was a sense of loss of control and growing obligation from smart phones and social media; being spread too thin. Over the next few months, we built test devices to better understand this problem and conducted further studies, eventually arriving at a simple wearable device that would show real-time status for a small group of friends and family. We were happy to see that a few other companies had arrived at similar conclusions - taking their own journey to get to this point. Products like Ringly and the Goodnight Lamp embodied some of the same thinking. Our idea for a wearable product was very much informed by Mozilla’s ethos and mission. In this simple device we were going to implement what amounted to a simple wearable web client capable of monitoring a handful of URLs and “displaying” the changing values supplied by those endpoints as visual light patterns and haptic feedback. We would bring the Mozilla Manifesto (https://www.mozilla.org/about/manifesto/) to the world of connected wearables, and bring both peace of mind and small moments of joy to young people at a time when many in the industry seem intent on exploiting their Fear of Missing Out, and are sometimes cavalier in their handling of privacy and data ownership.

Stumbling and a change in direction

Getting to grips with what it would take to produce this device and re-building momentum lost over the summer break had cost us, though. Just as this picture came into focus and we started to take the next steps in the plan, the team was called to account. Our enthusiasm and confidence in the product was not shared by the innovation board. There was some skepticism of our premise - that our audience of teenage girls would want such a thing - despite the research we had done.
And there were concerns about our ability to contain the cost and complexity implicit in the small, wearable form-factor. Given the finite resources available to the Connected Devices group and the ambitions of our project relative to the experience and expertise available to us, from the outside it looked like we were heading off into the weeds. At the same time, another team had concluded an exploratory project with an outside agency and had produced a report echoing many of the needs and values Haiku had identified. They had proposed a (non-mobile) device for the home which would facilitate communication and sharing between friends and family. We decided to put aside the wearable and pick up where this report left off. I’ve written already about some of this work. We produced a “learning prototype” to home in further on what people wanted from a device like this, and where we could have the most impact. We adopted a new target audience and use case - communication between kids and grandparents - and assessed priorities and features. We did some technical exploration and landed on what was essentially a WebRTC application, running on an embedded Linux device. The WebRTC architecture was a great fit: private and secure by default, with no need to store or pass personal communications through Mozilla’s servers. Each connection is point-to-point, and the very personal and private content implicit in the use cases would always be encrypted. With little to no data to store, an open-source codebase for client and services, and a minimum of setup, we could minimize the risk/threat of lock-in for the device owner. Meanwhile, we had questions. How might this device be used? What kinds of messages would these people want to send? Should we store missed messages? Is the device portable or not?
We knocked together another prototype, this time using left-over phones from FxOS days, to gather data and feedback from a set of grandparents and grandchildren over a couple of week-long studies.

Fleshing out the idea

The culmination of this work was a product definition that included the user market and use cases, the features and principles, as well as details on what we would need to implement and how. We had landed on a concept for a device and service that would give grandparents and their grandchildren an easy, one-touch experience to share moments using audio or emoji. The child would have a dedicated connected device, explicitly and exclusively paired to an app installed on the grandparent’s phone. We observed a magical thing emerging from the simplicity and directness of the experience: kids were able to carry out “conversations” without any assistance from their parents; they could own their relationship with distant loved ones. This was the real value proposition. Project Haiku wasn’t presenting a technical breakthrough as such, but taking existing technology and fostering joy, confidence and agency using open standards and the infrastructure of the web. The process we follow in Connected Devices has a “Gate 1” milestone in which, for a project to move forward, it should present a clear picture of what the product will be, demonstrate viability and a market fit, and detail what it will take to get there. It is evaluated against these and other criteria, including alignment with the Mozilla mission and alignment with the collective vision for Connected Devices. In December we presented to the board and found out later that week that we had met the criteria and passed Gate 1. However…

Back-burnered

The “however” was about resources and priorities: people, money and time. We simply couldn’t pursue every product at this time.
In the context of the emerging game plan for Connected Devices, Haiku was not a high priority, and the projects that were a priority were hurting for lack of people to work on them. So Project Haiku is on the back-burner. It’s possible, though unlikely, that we’ll be able to revisit it and pick development back up later this year. In the meantime, the best we can do is to ensure that the work and findings from this project are well documented so the organization and the community have the opportunity to learn what we learned. To that end, I’ll be putting my thoughts to paper on this blog in a series of posts on topics which Project Haiku touched. As usual with Mozilla, all our code and documents are publicly available. Please find me on IRC in the #haiku channel (irc.mozilla.org) as sfoster, or through my mozilla or personal email (sfoster at mozilla, sam at sam-i-am.com) if you have any questions. I’m also on twitter etc. as samfosteriam
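The “wearable web client” described earlier in this post - a device that watches a few status URLs and renders their values as light and haptic patterns - can be sketched in a few lines. This is an illustration only: the status names, the pattern table and the endpoint shape below are hypothetical, not anything Project Haiku actually specified.

```javascript
// Hypothetical sketch of a "wearable web client": poll a handful of
// status endpoints and translate each reported value into a light pattern.
// The status names and pattern table are invented for illustration.
const PATTERNS = {
  available: { color: "green", pulse: "slow" },
  busy: { color: "amber", pulse: "none" },
  thinking_of_you: { color: "pink", pulse: "heartbeat" },
};

// Map a reported status onto a light pattern; unknown values turn the LED off.
function patternFor(status) {
  return PATTERNS[status] || { color: "off", pulse: "none" };
}

// One polling pass over the watched URLs (assumes a global fetch, e.g.
// Node 18+ or a browser, and endpoints answering JSON like {"status": "busy"}).
async function pollOnce(urls, display) {
  for (const url of urls) {
    const res = await fetch(url);
    const { status } = await res.json();
    display(url, patternFor(status));
  }
}
```

On real hardware, the `display` callback would drive the LED and haptic motor; here it is just a function parameter, which keeps the mapping logic trivial to test.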
Posted 4 months ago by Air Mozilla
The Monday Project Meeting
Posted 4 months ago by gerv
Like every year for the past ten or more (except for a couple of years when my wife was due to have a baby), I’ll be going to FOSDEM, the premier European grass-roots FLOSS conference. This year, I’m speaking on the Policy and Legal Issues track, with the title “Reflections on Adjusting Trust: Tales of running an open and transparent Certificate Authority Program”. The talk is on Sunday at 12.40pm in the Legal and Policy Issues devroom (H.1301), and I’ll be talking about how we use the Mozilla root program to improve the state of security and encryption on the Internet, and the various CA misdemeanours we have found along the way. Hope to see you there :-) Note that the Legal and Policy Issues devroom is usually scarily popular; arrive early if you want to get inside.
Posted 4 months ago by Vnisha Srivastav
DinoTank is an internal pitch platform to innovate and solve current issues that are relevant to Mozilla’s mission. DinoTank 2016 came with a twist — instead of pitching ideas we focused on pitching problem statements. To give each DinoTank winner the best possible start, we set up a design sprint for each one. This is the second blog post in the series of DinoTank sprints, by Vnisha Srivastav, a volunteer in the Mozilla community.

Nearly 70% of the Indian population resides in rural areas, and about 47% of this rural population is women. These rural women, forming about 1/3rd of India’s population, play the key role in the social changes required for economic development and overall well-being. Coming from a rural area in Uttar Pradesh and having seen the life of women there, I can testify to the sorry state of rural women in India — young girls dropping out of school after primary education, getting embroiled in marriage and family responsibilities, families being apprehensive about girls using mobiles and the internet. These are probably the reasons for the biggest online gender disparity in India: only 2% of internet users in rural India are women. Being picked by the DinoTank jury, I got the exciting opportunity to gather a group of designers, engineers, social scientists and activists in Bangalore for a two-day sprint, to work in-depth on this problem.

Getting to understand the problem

In order to deeply understand the realities on the ground and gain perspectives, we partnered with Quest Alliance and interviewed a series of young women who had embraced opportunities and leapt out of the rural areas surrounding Bangalore to study technology and media. Through focus group sessions and interviews we learned about the hurdles and barriers they had to overcome to get access to education and technology. They talked about their own viewpoint on their situation and what helped them.
From this we extracted a series of insights that helped us understand the problem in more detail.

DinoTank Insights: “Connecting Rural Women on the Internet”

Moving from insights to ideation

Once we understood the problems better, we moved on to think about what could be done to help solve them. Using the Creative Matrix methodology to give the ideation process structure, the participants were challenged to think outside the box and in unconventional directions. While ideation was great, it was now time to narrow the outcome further down: we took our wall of ideas and went from over 100 to the strongest 3–4, which we particularly liked. We then started discussing them and creating storyboards. These compelling, user-focused narratives allowed us to really explore the ideas as end-to-end solutions and helped us focus on user experience and how they could work in real life.

Paper Prototyping

Lastly, we were tasked with creating paper prototypes of our ideas. This allowed us to mimic an experience very early on, at the idea phase. To identify what part of the experience to paper prototype, we looked for the critical point of failure. We asked ourselves: what in this concept/idea were we most uncertain of? What in the experience did we think people may not get? What did we feel we needed validation on? These paper prototypes helped us mock-test our ideas and really showed what worked and what didn’t.

What I learned

One of the things that impressed me was that the facilitators and my co-participants listened intently and offered sincere suggestions for improvement on the project, rather than politely acknowledging the discussion. I also learned how to ask the right questions and how to present research effectively. The design sprint was a hands-on, thought-provoking program that left me inspired and feeling much more empowered, with many actionable take-aways I can apply immediately to my problem statement.
The information and skills acquired in this sprint will greatly help me achieve my goals in taking this project forward.

Moving Forward

With the basic experiment in place, the next steps are:

- Testing the prototypes in real settings
- Building resources to further work on this project
- Looking for collaborations

Watch this space! I would like to sincerely thank Kotresh HB, Rina Jensen, Michael Henretty, Rosana Ardila, Subhashish Panigrahi and all the people present in the workshop for their contributions in making this effort successful.

Further reading

If you want to know more about what we did, what we developed and what we are doing, please read our full report. Furthermore, several participants blogged about the event:

- Shreyas Narayanan — My AHA moment
- Chaithanya Krishnan — DinoTank - Design Sprint 2016
- Subhasis Chatterjee — When Design Thinking Packs a Punch
- Vigneshwer Dhinakaran — Solving the Problem of Connecting Rural Women
- Bhuvana Meenakshi — The Versatile Oomph

“Connect Rural Women on the Internet” was originally published in Mozilla Open Innovation on Medium, where people are continuing the conversation by highlighting and responding to this story.
Posted 4 months ago by mkohler
In the past months we have worked to implement pieces of RepsNext. Now it’s time to share an update on where we stand and what has been done so far. Our work this quarter will also be focused on aligning with other parts of the organization and making use of existing resources to further implement RepsNext. We have also published the Reps program goals for the current quarter. Thanks to Yofie for creating this visualization! (minor last-minute changes made by Reps Council)

Resources

We have made significant progress in the last quarter on the Resources track. We now have a Review Team that has managed to bring the average review time down by 30% in the last quarter. We have also allocated our budget based on Participation’s priorities for the quarter, and we have published it with a live update so Reps know exactly where we spend our resources. Last but not least, we now have a definition of what a Resources Rep is, as well as a dedicated wiki for the Resources track where we track all our processes. But there is always room for more. This quarter we will focus on finishing the training and the application process so Reps can join the track. If you want to help with our work, please go ahead and read the draft training and provide your feedback. We truly value it!

Onboarding

During the past few months we’ve seen a problem with onboarding new Reps, mainly due to the lack of new mentors to take them on, as well as the huge number of applications we had to screen. After a mass screening phase we informed the accepted applicants and asked them to confirm their interest in joining the program. These will be onboarded as a test for the new onboarding webinar we’re creating. This new webinar is currently being drafted and will be finished soon. We have a few tweaks to make in terms of the content itself, but the general content ideas stand. If the first test with this new webinar is successful, we can improve the Reps-specific onboarding time significantly.
Additionally, we will be working together with the Community Development team this quarter to analyze current onboarding processes and identify common parts (including Reps and Campus Clubs).

Participation Alignment

Together with the Participation team we’ve worked a lot to align our goals with the team’s goals. The Council is working with the team to co-create the quarterly and yearly goals and OKRs for 2017. The program’s goals are also being created based on the team’s goals and priorities. Of course this is ongoing work that will continue this year. For the program to be successful we need to be aligned with the team’s goals as well as the broader Mozilla goals. The Reps Council will be highly involved in strategic and operational discussions as representatives of the broader community; this also means that at least one Council member will be attending regular Community Development/Open Innovation meetings.

Leadership

During the last quarter a team of volunteers led by Emma Irwin built and tested the leadership toolkit. This toolkit will act as the guidance for Reps who want to join the track. In the following months we will work on creating a solid roadmap for the track, on how people can join, and on how we can align our resources with the Leadership Network resources from MoFo.

Coaching

Last fall we onboarded new coaches in the Reps program to strengthen our coaching capacity for new Reps. All of these were already Reps. Guillermo led the efforts of this training and has created training material we can use for further coaching trainings. We also ran the training with a few existing mentors to test the training material. Additionally, we have started a Regional Coaches group.
The Reps Regional Coaches project aims to bring support to all Mozilla local communities around the world thanks to a group of excellent core contributors who will be talking with these communities and coordinating with the Reps program and the Participation team. These coaches are neither a power structure nor decision makers; they are there to listen to the communities and establish two-way communication. We want communities to be better integrated with the rest of the org, not just to be aligned with current organizational needs, but also to be more involved in shaping the strategy and vision for Mozilla and to work together with staff as a team, as One Mozilla. This quarter we will finalize our plan for how we will handle coaching in the future. We encountered a few challenges in the past few months which we will address in this plan. Our goal is to further improve the coaching material so it can serve for training new coaches as well as training existing mentors. We want all existing mentors to be trained with this material once we have a solid plan. We are also thinking about renewing existing mentors’ commitments. All of this will ensure that all Reps can grow and advance in personal skills and their volunteer goals with the help of their coaches, in addition to the leadership track.

Functional areas

While the question about functional areas came up from time to time during the “Working Groups” phase of RepsNext, we never had a dedicated group for it. Therefore there is no solid proposal on how to move forward there. We know that this would involve a lot of time commitment from both the functional teams’ side as well as ours. This is currently not realistic to implement or analyze. Reps are encouraged to build and develop communities around functional areas with direct input from functional teams, but we will not focus on this part for at least the first half of this year.
We need to have a strong base as a mobilizer program first. You can follow all the Reps program’s goals in the Reps Issue Tracker. What thoughts cross your mind upon reading this? Where would you like to help out? Let’s keep the conversation going! Join the discussion on Discourse.
Posted 4 months ago
I’ve been a user of the Web for around 21 years. Although it’s difficult to remember, I’m pretty sure ‘tabbed browsing’ post-dates my first use of the Web. I can certainly remember, in Microsoft Internet Explorer, having to open a new window every time I wanted to visit a different website. It was one of the reasons I liked Netscape Navigator, later moving seamlessly to Mozilla Firefox.

While there’s been all kinds of wonderful innovation on the web, there doesn’t seem to have been as much innovation in tabbed browsing. Granted, you can mute certain tabs, pin them, and close all but the one you’re on. But, fundamentally, other than Tree Style Tab and the slightly unintuitive Tab Groups, tabbed browsing doesn’t feel much different than it did 20 years ago. In a recent blog post I came across via Medium, Patryk Adaś made me aware of a Mozilla project that is focused on “evolving the standard tabbed browser towards a model based on trails”. It’s an interesting concept, shown visually in a 14-second video demonstrating how it works. The example used, of someone ‘deciding on a pizza joint’, is trivial, but I’m particularly interested in this from a new literacies point of view. Given that we’re at a time when we can’t necessarily trust information that comes from a particular domain (I’m looking at you, whitehouse.gov), something that shows the trail people took to get to a website they trust would be a valuable tool.

Mozilla has a habit at the moment of shutting things down, in the hunt for ‘scale’. I hope this particular project sees the light of day, and I get to both use this myself and demonstrate it to others. Comments? Questions? I’m @dajbelshaw on Twitter, or you can email me: hello@dynamicskillset.com
Posted 4 months ago by standard8
As Jared has been posting, we have gradually been enabling ESLint rules for the Firefox code base. We’ve created a page on devmo for ESLint help & hints. In particular, there are links to details on how to integrate it into your editor, and also hints and tips for fixing issues. If you have questions or comments, feel free to join us in the #eslint channel on IRC.
Posted 4 months ago by Daniel Stenberg
Following up on the problem with our current lack of a universal URL standard that I blogged about in May 2016: My URL isn’t your URL. I want a single, unified URL standard that we would all stand behind, support and adhere to. What triggers me this time is yet another issue. A friendly curl user sent me this URL:

http://user@example.com:80@daniel.haxx.se

… and pasting this URL into different tools and browsers shows that there’s not a wide agreement on how this should work. Is the URL legal in the first place, and if so, which host should a client contact?

- curl treats the ‘@’-character as a separator between userinfo and host name, so ‘example.com’ becomes the host name, the port number is 80, followed by rubbish that curl ignores. (wget2, the next-gen wget that’s in development, works identically)
- wget extracts the example.com host name but rejects the port number due to the rubbish after the zero.
- Edge and Safari say the URL is invalid and don’t go anywhere.
- Firefox and Chrome allow ‘@’ as part of the userinfo, take the ‘80’ as a password, and the host name then becomes ‘daniel.haxx.se’.

The only somewhat modern “spec” for URLs is the WHATWG URL specification. The other major, but now somewhat aged, URL spec is RFC 3986, made by the IETF and published in 2005. In 2015, URL problem statement and directions was published as an Internet-draft by Masinter and Ruby, and it brings up most of the current URL spec problems. Some of them are also discussed in Ruby’s WHATWG URL vs IETF URI post from 2014.

What I would like to see happen…

Which group?

A group! Friends I know in the WHATWG suggest that I should dig in there and help them improve their spec. That would be a good idea if fixing the WHATWG spec were the ultimate goal. I don’t think it is enough.
The WHATWG is highly browser-focused, and my past interactions with members of that group have shown that there is little sympathy there for non-browsers who want to deal with URLs, and even less sympathy or interest for URL schemes that the popular browsers don’t support or care about. URLs cover much more than HTTP(S). I have the feeling that WHATWG people would not like this work to be done within the IETF, and vice versa. Since I’d like buy-in from both camps, and any other camps that might have an interest in URLs, this would need to be handled somehow. It would also be great to get other major URL “consumers” on board, like authors of popular URL parsing libraries, tools and components. Such a URL group would of course have to agree on the goal and how to get there, but I’ll still provide some additional things I want to see.

Update: I want to emphasize that I do not consider the WHATWG’s work bad, wrong or lost. I think they’ve done a great job at unifying browsers’ treatment of URLs. I don’t mean to belittle that. I just know that this group is only a small subset of the people who probably should be involved in a unified URL standard.

A single fixed spec

I can’t see any compelling reasons why a URL specification couldn’t reach a stable state and get published as *the* URL standard. The “living standard” approach may be fine for certain things (and in particular for browsers that update every six weeks), but URLs are supposed to be long-lived and inter-operate far into the future, so they really, really should not change. Therefore, I think the IETF documentation model could work well for this. The WHATWG spec documents what browsers do, and browsers do what is documented. At least that’s the theory I’ve been told, and it causes a spinning and never-ending loop that goes against my wish.
Document the format

The WHATWG specification is written in a pseudo-code style, describing how a parser would “walk” over the string, with a state machine and all. I know some people like that; I find it utterly annoying and really hard to figure out what’s allowed or not. I much prefer the regular RFC style of describing protocol syntax.

IDNA

Can we please just say that host names in URLs should be handled according to IDNA2008 (RFC 5895)? WHATWG URL doesn’t state any IDNA spec number at all.

Move out irrelevant sections

“Irrelevant” when it comes to documenting the URL format, that is. The WHATWG spec details several things that are related to URLs for browsers but are mostly irrelevant to other URL consumers or producers, like section “5. application/x-www-form-urlencoded” and “6. API”. They would be better placed in a “URL considerations for browsers” companion document.

Working doesn’t imply sensible

So browsers accept URLs written with thousands of forward slashes instead of two. That is not a good reason for the spec to say that a URL may legitimately contain a thousand slashes. I’m totally convinced there’s no critical content anywhere using such formatted URLs, and no soul will be sad if we restricted the number to a single digit. So we should. And yeah, then browsers should reject URLs using more. The slashes are only an example. The browsers have used a “liberal in what you accept” policy for a lot of things since forever, but we must resist using that as a basis when nailing down a standard.

The odds of this happening soon?

I know there are individuals interested in seeing the URL situation get worked on. We’ve seen articles and internet-drafts posted on the issue several times in the last few years. Any year now I think we will see some real movement toward fixing this. I hope I will manage to participate and contribute a little from my end.
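The disagreement described earlier is easy to reproduce yourself. As a sketch: Node.js ships a WHATWG-style URL parser, so feeding it the ambiguous URL shows the browser-style interpretation, where userinfo is split from the host at the last ‘@’.

```javascript
// Parse the ambiguous URL with Node's WHATWG URL implementation.
// A WHATWG parser splits userinfo from the host at the *last* '@',
// which is why the host becomes daniel.haxx.se rather than example.com.
const u = new URL("http://user@example.com:80@daniel.haxx.se");

console.log(u.hostname); // "daniel.haxx.se"
console.log(u.password); // "80"
console.log(u.username); // "user" plus the first '@' percent-encoded into it
```

An RFC 3986-style parser, by contrast, splits at the first ‘@’ and would see example.com as the host, which is exactly the curl behaviour quoted above.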
Posted 4 months ago by Karl Dubost
Can I travel to a work week meeting without any data on my laptop, aka a clean install? I have had the same running question for quite a couple of years now. And this week, international news led me to rethink it. Mobile phones are easy for me: I have been living without one for years (since 2001, exactly). Social media accounts? I don’t have a twitter account anymore, never had a facebook account, and killed Flickr years ago when they were bought by Yahoo! If I’m traveling anywhere in the world for work, do I need to have data on my laptop? When I say no data, I really mean it. It means you erase the laptop and make it as if it were coming out of the factory. No login, no password, no mail configuration.

So I’m thinking about having a laptop specifically for traveling, with a clean install, and wondering what the consequences would be for working. Both for myself and my team mates. The great thing about working on open source projects and archiving everything online is that I could still work on some things without having to carry a single ssh passphrase or password with me. I can still do git clone or hg pull. I can still prepare work. I can do local commits on a laptop. I can write documents or email drafts. I can take notes during discussions. I can read old threads and documentation on wikis.

Some of the things that need buffering and delay:

- Commenting on bugs
- Sending emails
- Pushing code
- Probably other things

Once the work week is finished, I can come back to my home location with the work produced during the week on this laptop, but still with no personal data. Just the work produced during the week. And then send all the work and push the commits once I’m back home. It seems doable, and a minor disturbance compared to the risk of having your laptop searched against your will and your privacy breached. Otsukare!