Add position on WebUSB. #193
Conversation
There are concerns with SOP violations as well, if the USB device permits reading data. |
If the USB device can read data from what? system memory? |
I was thinking more like:
This had previously come up in FIDO discussions, where existing PIV identity cards were proposed, and disliked, because they had no concept of 'origins' or individual identities per origin. |
@tomrittervg raises an interesting point. If there are properties of the device that are stable across origins and sufficiently unique, or an ability to effect a stable change to a device, then tracking is enabled. But we don't need more than one reason to label something as "harmful", and a full dissertation on the ways in which something is "harmful" is not necessary. |
It may be worth listing all our reasons if we think people will try to change the proposal to avoid the harms we have identified. |
Please comment on the issue then with this argument. I don't want to lose it and only the issue will remain linked once this PR is merged. |
If you want cases FOR the ability to take advantage of WebUSB, the MicroBlocks tool for programming microcontrollers via a web browser would be at the top of my list. The current MicroBlocks IDE is installed natively per OS. It works by having the IDE program "blocks" definitions that run on a "virtual machine" on the MCU board. Communication works via serial over USB. I think "Web Serial" commands would be sufficient. Is there a safe mechanism that can be implemented so that the main MicroBlocks IDE can run in a browser and communicate via serial protocol over USB to the MCU board?
|
Co-Authored-By: Martin Thomson <[email protected]>
I've revised this to add a sentence mentioning tracking issues. |
I think that we should merge this, but be prepared to update it as we learn more.
This issue isn't solely about "sensitive" hardware, but in any case, the specification does not provide technical mechanisms to protect the user beyond a permissions prompt. An Arduino is actually a good example of a dangerous device, because it is subject to remote firmware installs. |
The Chrome team has made a different cost/benefit analysis than we did.
Actually, I am aware of at least one vulnerability that is a result of this API. https://www.yubico.com/2018/06/webusb-and-responsible-disclosure/ |
Large vendor tools like the ARM Keil Web IDE rely on WebUSB. The discussion here seems to mostly be around vague concerns that have not been proven to be problematic. Updating device firmware is exactly one of the big use cases of WebUSB. |
The whole point, I believe, is to think ahead and consider the problems before they become problematic. We don't need ActiveX, Flash, or any of the other security nightmares that were OK'd because they were used by or came from big business, all over again. Rather than saying this must be OK'd because ARM have a Web IDE, maybe ARM should consider going about things in a different way. I'm only lightly knowledgeable on this subject, since finding that Firefox lacks WebHID stopped me from doing something and alerted me to the issue. |
To raise concerns you have to present scenarios that are based on an actual technical underpinning. The scenarios being outlined here always revolve around extremely broad, unspecific situations that are impossible to discuss in any meaningful way because they leave so much room for interpretation. Instead of talking about what exactly the actual technical risks are, the bottom line is always "USB dangerous", and then that's the end of the discussion.

This discussion has been going on for over 7 years now. This is not a new standard; it is a well-established one by now, and all I can do is tell my users not to use Firefox, because any answer to the raised concerns seems to be either ignored or waved off. It does not seem possible to have a productive discussion with Mozilla on this topic at the moment. Every single issue brought up in this discussion so far has very reasonable and good ways of mitigating risks, but I have yet to see the people fundamentally opposed to WebUSB engage with those counterpoints in good faith. This is frustrating, but it also pushes Chromium even further into a role that dominates the market.

No one is saying to just hand-wavingly implement WebUSB with no regard to security. What hardware developers like myself are asking is to engage the topic seriously and with an open mind, instead of shutting down the discussion every time it is brought up because minds seem to be made up before the discussion is even had. |
How do you address the issue with the server side being compromised, and serving code that can damage users devices? |
I would first look at what the current situation and risk profile for a USB device is, and then ask whether any of those factors would become worse via this new form of access. In this realm, absolutely any tool the user installs can pose a risk to a USB device if it has that intention. The hurdle for a user to install something means clicking a download and opening the installer; by default, Firefox presents a warning dialog that downloaded software can be dangerous. They have to acknowledge this and continue. They have no way of knowing whether the binary they downloaded has been compromised; they just have to reasonably assume this is not the case.

The main difference is that a user would fetch the program for interacting with their device a lot more often if they are dealing with a web-based tool. So the exposure potential is certainly a bit higher, as native tools tend to update a little less frequently, but a native tool is also prone to a compromised server any time it runs an update. That is an inherent risk with any piece of software.

Now for the actual risk described here, namely damaging a device with code: USB has, by design, no way of knowing whether it is interacting with a malicious piece of code. It is the hardware designer's responsibility to verify data coming in via USB. This is the case no matter in what environment the host-side code is executed. Accessing a USB device through a web browser actually brings some security advantages here: the website has to be given access to a specific USB device by the user, versus being able to access absolutely any device present on the system. This reduces the risk of malicious software scanning for devices it deems a target that might not relate to the device the website is normally built for. This does not eliminate risk, but it mitigates it to a reasonable degree in my opinion.
Very clear and very explicit modals that inform the user and force the user to make conscious choices are such mitigations, and I'm certainly in favour of making these modals much more present and disruptive than the ones currently implemented for audio and video. At the end of the day, this and quite a few more of the security risks raised are inherent to USB itself. That does not mean they should be discarded, but we can't change how USB works; that is a separate discussion and not a risk specific to WebUSB versus native USB. The question is: is it better for the user to interact with their devices via a downloaded piece of software running fully privileged, or is it better to sandbox the code so it can interact with nothing but USB, and only with devices approved by the user any time an interaction in a new session is needed? |
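The device-scoped access model described above can be made concrete with a small sketch. This is illustrative code, not code from this thread: the filter-matching helper below is a simplified version of the matching rule in the WebUSB specification (a filter matches only when every field it specifies equals the device's value; the interface class/subclass/protocol cases are omitted for brevity).

```javascript
// Simplified WebUSB device/filter match: each field present in the filter
// must equal the corresponding property of the device.
function matchesFilter(device, filter) {
  if (filter.vendorId !== undefined && device.vendorId !== filter.vendorId) return false;
  if (filter.productId !== undefined && device.productId !== filter.productId) return false;
  if (filter.serialNumber !== undefined && device.serialNumber !== filter.serialNumber) return false;
  return true;
}

// In a browser that supports WebUSB, a page cannot enumerate the bus; it must
// ask the user to pick one device matching its filters:
//
//   const device = await navigator.usb.requestDevice({
//     filters: [{ vendorId: 0x18d1 }],  // e.g. only devices with Google's vendor ID
//   });
//   await device.open();
```

The point of the sketch is that the grant is per-device and user-mediated, which is the mitigation the commenter is describing, as opposed to a native tool that can open any device on the system.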
You also need to look at the guarantees currently offered by the Web as a platform and figure out if this kind of proposal is going to make the web look - basically - as unsafe as other platforms. Historically the web gives you peace of mind that navigating to a web site won't cause issues. |
WebMIDI had the same problem, and Mozilla shipped it not so long ago. Clearly there is a specific kind of consent flow past which hardware damage is considered acceptable. Is the issue of device damage something that needs to be addressed for WebUSB at all, given that it is self-evidently okay for WebMIDI? (This isn't strictly on topic, but WebSerial has been rejected with the same justification, while it is--even more so than WebUSB, see below--essentially equivalent to WebMIDI with SysEx, and just like with WebMIDI, you can already install an extension that adds the WebSerial API to Firefox.)

One thing that WebUSB trivially allows that neither WebMIDI nor WebSerial do is the use of USB devices to attack the host. For example, any device based on the Cypress FX2LP ASIC (of which there are likely billions, if not more) has a hardware implementation--impossible to disable, impossible to detect during enumeration--of a vendor-specific control request 0xA0 that performs (by design) arbitrary code execution. If a user grants a webpage consent to use such a device, the webpage can then trivially elevate its privileges to those of the terminal session by reprogramming the device so that it disconnects, re-enumerates as a keyboard, and then types commands.

The consent flow for WebUSB has to take this into account, and I also think it should make this point more firmly than the consent flow for WebSerial or WebMIDI. If you're using WebSerial, then potential device damage is almost certainly an understood and accepted consequence of normal, non-malicious use; if you are reflashing a bootloader or driving a 3D printer, it is fully expected that the device may break. WebMIDI has a consent flow warning you about it potentially installing malware, which is quite unlikely, but device damage is definitely possible, and in some cases (intentional firmware upgrade) must be expected.
WebUSB differs from both of those in that it can actually--often very easily, and using only documented interfaces--install malware on the host. |
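To make the FX2LP point above concrete, here is a hypothetical sketch (not code from this thread) of the documented EZ-USB "Firmware Load" mechanism as it would look through the WebUSB API. The helper only builds the setup-packet parameters that `USBDevice.controlTransferOut()` takes as its first argument; the register address and request number are from Cypress's public documentation.

```javascript
// EZ-USB FX2LP "Firmware Load" vendor request: bRequest 0xA0 writes the
// transfer's data payload directly into on-chip RAM at the address in wValue.
const FX2_FIRMWARE_LOAD = 0xA0;
// CPUCS register: bit 0 holds the 8051 core in reset while RAM is rewritten.
const FX2_CPUCS_ADDR = 0xE600;

// Build the controlTransferOut() setup parameters for one RAM write.
function fx2RamWrite(address) {
  return {
    requestType: 'vendor',
    recipient: 'device',
    request: FX2_FIRMWARE_LOAD,
    value: address, // wValue = target RAM address
    index: 0,
  };
}

// A page granted access to such a device could then, roughly:
//   await device.controlTransferOut(fx2RamWrite(FX2_CPUCS_ADDR), new Uint8Array([1])); // stop CPU
//   ...further 0xA0 transfers writing replacement firmware into RAM...
//   await device.controlTransferOut(fx2RamWrite(FX2_CPUCS_ADDR), new Uint8Array([0])); // run it
```

This is why the comment argues the WebUSB consent flow must treat "access to a device" as potentially "arbitrary firmware on that device": the request is a documented, unremovable hardware feature, not a bug to be patched.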
It turns out that WebMIDI is being used as an experiment to determine whether other negative positions should be reversed: #720. That's a very reasonable position! |
Providing a much more private and secure way to do what is currently done via downloading and running a native application is a privacy and security improvement rather than a downgrade. It's a game changer on desktop operating systems with no mandatory sandbox, such as Windows, macOS and desktop Linux. Mozilla's current stance is effectively that downloading and running native code is safer than granting a web site access to a specifically chosen USB device. It's no harder to download and run a native application with Firefox than it is to select a specific USB device and grant access to it in a Chromium-based browser; Chromium's approach is in fact far more private and secure. If you want to make it harder to abuse, you can require users to go into a browser menu to enable WebUSB before access can be requested, where you can explain what WebUSB is and that USB devices could have vulnerabilities in them.

Firefox already grants extensive access to the GPU by default via WebGPU, which is far more of a practical security problem than tricking users into giving access to an obscure USB device with a vulnerability. If these kinds of USB devices are identified, they can be blacklisted, but Firefox doesn't blacklist GPUs when drivers/firmware have known unpatched vulnerabilities, including end-of-life drivers/firmware that no longer receive updates, and no permission is needed to use them.

Chromium already makes it clear that access is being granted to a specific USB device, while downloading and running a native application does not make it clear that it's going to get access to all your browser history, browser logins, browser data and also all your other data from other applications, etc. That is in fact not clear to the average user, especially when the main operating systems they use on mobile devices don't work that way, and desktop operating systems give the appearance that they don't work that way since some applications do need to ask for permissions.
Some USB devices have vulnerabilities which can be exploited, but there's no need to exploit vulnerabilities under Firefox's current approach, since having the user download and run a native application gives access to everything on desktops, and on mobile gives the ability to get access to a USB device via the same dialog that would be shown by the browser. This is also a rare case where developers are highly motivated to use the more private and secure approach, which is WebUSB, since they can avoid making per-platform applications as long as they give up harvesting a bunch of data, hooking into a bunch of applications, etc., as you can see happening with software for updating a Logitech mouse and the like. USB devices should be secured against being plugged into malicious hardware. If they aren't, that's already a problem, and the native app you're forcing people to use instead of a web app can take full advantage of that too.

GrapheneOS uses WebUSB for our web installer partly because it avoids users needing to download and run software like Android platform-tools without verifying a signature for it on platforms where these tools aren't available in their repositories. Debian and Ubuntu have horribly out-of-date and broken packages. We avoid these problems with WebUSB and also get portability to far more operating systems, including Android and ChromeOS, without having to make an Android app. You'd be able to install GrapheneOS from an iPhone too if Safari ever added WebUSB support. With either the WebUSB or traditional install method for GrapheneOS, all of the firmware and the OS are fully cryptographically verified, with a key fingerprint displayed at boot for the OS. Users are in fact not trusting that our website isn't compromised if they verify the key fingerprint against one obtained out-of-band.
Android devices that support installing another OS implement a security model for fastboot where you have to approve unlocking and locking, firmware images are verified with downgrade protection, and OS images are also verified with downgrade protection after the device is locked again. Tricking someone into booting fastboot mode, granting access to a computer and unlocking is not a realistic attack vector. If our website were compromised, then the traditional non-WebUSB install method is only safe for people who use signify to verify the zip before running a script from it, but macOS and Windows users don't have a safe way to obtain signify that isn't simply downloading it from another site without verifying it. With either install method, verified boot for firmware and the OS, with the key fingerprint shown at boot and hardware attestation via Auditor, provides a verification method that avoids having to trust that the computer used for the installation wasn't compromised.

When Firefox users run into the lack of WebUSB support, we almost always convince them to use Edge (included in Windows), Chrome (included in Android) or Brave (tends to be what we get Linux and macOS users to use). It's often the start of them leaving Firefox behind. I'm not sure what is accomplished by Firefox not implementing it beyond frustrating people and bleeding a little bit of extra market share.

In GrapheneOS, our focus is giving users the ability to use the apps and services they choose more privately and securely, rather than disallowing them from doing so. We do encourage using more private apps and services, but the OS doesn't force it on people. It's why we have our sandboxed Google Play compatibility layer and just added Android Auto support as an extension to it, Storage Scopes to replace storage/media permissions, Contact Scopes to replace the Contacts permission, etc., because telling users not to do things doesn't work well.
Making it possible for them to do it privately/securely, and making the more private/secure approach low friction, is the way to go. People want to get stuff done and will do it in the only way they have available. Not having WebUSB just leads people to use native applications, or Chrome/Brave/Edge/etc. I don't see the downside, especially if it starts out behind a toggle in the browser settings so there isn't even an opportunity to show a dialog unless the user goes out of their way to enable it. There's already the option to get people to download/run native code by default, and no toggle blocking that functionality by default. |
I was also unaware of that position. That is great to see and I think it is a very sensible way of approaching this. I would love to see WebSerial receive the same treatment in the not so distant future; that would definitely make things easier for many device makers that rely on a more generic protocol for their hardware. I also agree with @whitequark's previous remarks. WebUSB is certainly more difficult to handle correctly and I would like to see a much more stringent approach when it comes to the UI flow. Another option that @whitequark and I discussed earlier is to implement WebUSB not as a free-for-all API but with a flow similar to how WebMIDI is currently implemented, where the driver is installed permanently and locally, so that a rogue website can't just deploy a malicious driver. That would be a safer implementation than what Chrome is currently doing. |
@thestinger tbh I didn't read all of your comment because you don't address the point I'm making. I'm not against WebUSB, I only claim that it needs to be implemented in a way that doesn't water down the security expectations of the web platform, and with meaningful user consent / trust chain. The current chromium WebUSB doesn't provide that. |
What are those security expectations? Firefox provides WebGPU without any permission required, including with known-vulnerable drivers and firmware. It doesn't get disabled when the GPU driver/firmware is known to have unpatched vulnerabilities. There's no dialog to select a GPU and grant a site a bunch of access to it. That's far more of a real-world problem than users explicitly selecting a vulnerable USB device and granting access to it. A website can make extensive use of the GPU without user consent, even when the GPU driver/firmware has clearly been unpatched for months or years. Browsers decided GPUs should be treated as secure against remote attacks despite knowing they aren't, and made a bunch of access to them available. Why are USB devices so different?

Is the distinction that USB devices can lack any attempt at providing a compatible security model, rather than just having major vulnerabilities in their implementation? It would have been possible to require USB devices to advertise support for WebUSB, but it's quite late for that now and existing devices aren't ever going to do it. It's still possible, but Chromium isn't going to kill WebUSB support for existing devices, so it's unclear how to get devices to add that. It was more realistic many years ago and would be nice, but that's not how things played out.

Selecting and granting access to a USB device with WebUSB is comparable to the friction involved in running a native application, but at least it involves explicitly granting access to something instead of that going unstated. It's not as if there's any attempt to describe what a native application can do when you open it from downloads. Downloading things is part of the web platform too. There's even a UI to execute files without opening a file manager. Is that slated for removal? |
There's very little leverage to get device makers or developers to do something new, since Chromium-based browsers already have the feature. It would have been nice to require devices to opt into WebUSB, but I don't see that happening now. If you want that to happen, Mozilla or Apple needs to make it clear that they will add WebUSB with that constraint and publish a standard for advertising support for it as soon as possible, so devices can start doing it. All these years of simply saying no is why that doesn't exist, and the possibility of it existing keeps shrinking. Most devices that are relevant could already support it. I'm sure Android would have it for fastboot and adb. Both of those are great examples of dangerous protocols where there's a hidden way to enable them (enabling developer options, then OEM unlocking and ADB debugging, then approving keys for ADB and approving unlocking for fastboot). They're a perfect example of how granting USB access can be dangerous, but only when combined with very explicit steps to enable it.

If you don't want to allow doing dangerous things with WebUSB, then you don't support what we want from it, which is a safer, easier way to replace the OS on a device you own. The whole point is that we want to do something that could be considered dangerous. Android, Windows and ChromeOS come with a browser with WebUSB support. macOS doesn't, but it's not hard for people to install one. iOS is the only platform with leverage over developers on this and many other things. They can force people to do things the way they want because there's no alternative; the only option is joining their MFi program and using a native app, which is not an option for software not made by the device vendor themselves. There's always an easy alternative to using Firefox, usually one people already have, unlike iOS, where people own the device but can't do what they want. |
To reiterate, this is not possible to do for devices made with some very popular ASICs, and that's not something that will change any time soon. "Should" isn't something that will change decades of existing practice of trusting the host, nor is it something that will create cost-effective ways of adding USB capabilities to hardware. (The comparison with a native app is irrelevant, since the browser provides a security boundary where the desktop does not. The browser has this problem because it is better at enforcing boundaries than the desktop, and because there is interest in preserving this enforcement of boundaries.) Just to be clear, I do not want to a priori rule out reprogramming of USB devices through WebUSB; I also have a use case relying entirely on that. I want to rule out unintentional and malicious reprogramming of USB devices, and I am saddened by the fact that this is very difficult at best. |
Our use case is reprogramming the USB device by installing GrapheneOS on it. Android can and does provide software-based USB peripheral functionality from the OS: MTP/PTP file transfer, Ethernet tethering, MIDI and, since Android 14 QPR1, also webcam support including using the microphone(s). It would be entirely possible and quite useful for it to also provide the ability to act as a keyboard, touchpad, etc. That shows it could trivially be used to attack the host by acting as a keyboard. This is our use case, which is completely reasonable and very well secured. We secure TLS to the extent that's possible with current browsers. We use CAA with

Users have to enable developer options, enable OEM unlocking, reboot the phone into fastboot mode, trigger unlocking from the browser, accept a dialog on the phone via volume up/down + power, and only then can the browser freely flash the device. The extra authorization steps matter, since an attached computer shouldn't be able to do those things without permission. If a device allows that, then that's already a problem without WebUSB; anything it's plugged into, including a charger, could flash new firmware/software.

A significant majority of users are already on Chromium-based browsers with WebUSB. Many Firefox users already have a Chromium-based browser around for the ever-increasing number of sites not working in Firefox because of cases like this one. The vast majority of GrapheneOS users install it via the web installer. We strongly encourage it and discourage anyone inexperienced from using the CLI install process instead. There's very little reason for most users to use the CLI install process. There are some benefits if you're on a distribution like Arch Linux, where you can get signify and up-to-date fastboot from the distribution repositories, so you can verify the zip signature before flashing and then verify the key fingerprint again after flashing, as web install users will be doing.
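For context on the flashing flow described above: fastboot is a simple ASCII protocol, so a web installer driving it over WebUSB mostly sends text commands and parses short text replies. The sketch below is illustrative only (it is not GrapheneOS's actual installer code); it parses the documented response format, a 4-character status of "OKAY", "FAIL", "DATA" or "INFO" followed by an optional payload.

```javascript
// Parse one fastboot protocol response: 4-char ASCII status + optional payload.
function parseFastbootResponse(text) {
  const status = text.slice(0, 4);
  if (!['OKAY', 'FAIL', 'DATA', 'INFO'].includes(status)) {
    throw new Error(`unrecognized fastboot status: ${status}`);
  }
  return { status, payload: text.slice(4) };
}

// In the installer, a command like "getvar:unlocked" would be written to the
// device's bulk OUT endpoint via device.transferOut(), and the reply read with
// device.transferIn(); a response such as "OKAYyes" means the bootloader
// reports the device as unlocked, while any "FAIL..." response aborts the flash.
```

The key point relative to the security discussion is that every dangerous operation (unlock, flash) still requires the on-device approvals listed above; the browser-side protocol handling adds no authority of its own.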
Developers will be using CLI flashing, so they might as well learn how to do it, but flashing development builds is a different process, and the OS build produces a fastboot executable so you don't need it from elsewhere. Mozilla's positions on many of these standards are the same as Apple's, and Apple's position is clearly based on anti-competitive behavior to promote their app ecosystem. Mozilla is doing the same thing: promoting native apps over web apps. Mozilla's position is "no, use a native executable which we allow you to easily download" rather than "no, that's too dangerous", because they do not make it difficult to download and run native executables.
Downloads are part of the web platform. There are even programmatic APIs for downloads. Running an executable directly via Firefox's UI is fully supported, with very little friction involved. On Android, Firefox supports being granted the ability to request installing/updating apps; it will directly request permission to install/update apps. It's part of what it was built to provide, and I don't think that can be ignored. It's not something that comes for free as part of providing downloads, but something that has been explicitly added; otherwise, you'd have to open the executables in the file manager. Some desktop platforms have terrible code signing systems (macOS, Windows) and may add an extra dialog to this, but it's not a lot of friction. It's relevant that this is already easy for users to do in a much less private and less secure way, which Mozilla forces users into by not providing a better alternative.

WebUSB is a better alternative to downloading and running a native executable from a website. WebUSB is not perfect, and WebUSB can be dangerous, but the fact that it's a privacy and security improvement over the status quo makes not including it anti-privacy and anti-security. The real-world privacy and security provided to end users is what matters, and in the real world, downloading and running a native executable is the status quo. Chromium replacing that status quo has been very beneficial, whether people like it or not.

When Mozilla was making FirefoxOS, these features were explicitly on the roadmap. There was a goal of making web apps as capable as native apps, including providing these capabilities and a signing security model comparable to native apps. At some point, Mozilla forgot they were trying to turn the web platform into an alternative to platform-specific native apps. If certain capabilities are regarded as too dangerous for regular web pages, then provide proper installable apps with key pinning, key rotation and downgrade protection. You could even make an app store for them. If all we had to do was bundle up our web installer into a Mozilla app store bundle and submit it, we'd do that, but we're not writing code specific to Firefox. I think the problem is ultimately that Mozilla simply says no instead of having a vision for capable web apps and building that vision. Google are the ones with a vision for web apps, and they are the ones building it, so Mozilla gets no say in how it happens, because they're not acting as builders but as critics on the sidelines providing no alternative. |
I do agree that this would be a strict upgrade compared to downloading a native executable on a desktop platform, and I do not see a rational argument based on security concerns against providing an API like WebUSB behind this requirement. (It would be equivalent to, or better than, installing a native extension polyfilling WebUSB through the use of native messaging, which is already possible, as demonstrated by the WebSerial polyfill.) |
After reading this thread, I'm shocked. I didn't expect Mozilla to be this unreasonable and detached from reality. The web is more secure than native apps because here we have permissions. While proper signing (not TLS) would make it even more secure, you can't expect developers to wait for Firefox to ship it. They just instruct their users to use Chromium-based browsers instead, which makes Firefox fade further into irrelevance. This is happening right now, in the real world. Wake up, Mozilla! |
Here's an attempt to add a position for WebUSB. I'm particularly interested in feedback on the wording here: is this the justification we should use, should we raise other issues, etc.?