Support for X509SubjectName Name ID
cantor.2 at osu.edu
Fri May 15 19:26:33 UTC 2020
On 5/15/20, 3:09 PM, "Ullfig, Roberto Alfredo" <rullfig at uic.edu> wrote:
> How many InCommon SPs are probably insecure in this fashion?
Less than the much larger pool of cloud apps that aren't part of InCommon, but if you're asking if it's zero, probably not.
Sorry if this is all "Pandora's Box", but...
Offhand, I don't think any of the systems I've found were InCommon members, actually. I don't "credit" InCommon with that, any more than I would blame it if it weren't the case.
> Does InCommon insert WantAssertionsSigned="true" into the metadata automatically
I don't think InCommon even supports setting the flag. Part of that has to do with not encouraging the use of broken systems. If you have to set the flag, the chances of it being for a good reason are almost nil, so supporting it coddles and incentivizes bad behavior.
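For context, the flag in question is an attribute on the SP's metadata role element, as defined by the SAML V2.0 metadata schema. A minimal sketch (the entityID and endpoint here are placeholders, not a real deployment):

```xml
<md:EntityDescriptor xmlns:md="urn:oasis:names:tc:SAML:2.0:metadata"
                     entityID="https://sp.example.org/shibboleth">
  <!-- WantAssertionsSigned="true" merely *requests* that the IdP sign the
       Assertion element; it does not make the SP verify anything. -->
  <md:SPSSODescriptor WantAssertionsSigned="true"
      protocolSupportEnumeration="urn:oasis:names:tc:SAML:2.0:protocol">
    <md:AssertionConsumerService index="1"
        Binding="urn:oasis:names:tc:SAML:2.0:bindings:HTTP-POST"
        Location="https://sp.example.org/Shibboleth.sso/SAML2/POST"/>
  </md:SPSSODescriptor>
</md:EntityDescriptor>
```

Note that the attribute is advisory: it changes what the IdP sends, not what the SP checks, which is exactly why setting it can't repair a broken implementation.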
InCommon is not positioned to help with this because InCommon is not the one buying or integrating the products and can't perform the testing. It could provide for information sharing in the manner of a REN-ISAC to help identify and expose these bad actors, most of whom are not members anyway, but there hasn't been much progress there. I tried to initiate some discussion about that when I found a half dozen systems broken by the comment/truncation bug in 2018, or whenever it was.
> or does it have anything else that would protect against such a SP?
You can't prevent bad software from operating with metadata. Setting the flag doesn't "fix" anything. It hides the problem by making you think the signature matters when the system actually doesn't check it.
Bugs like this can only be found through active testing.
> Would it be a good idea when releasing attributes to a new Service Provider to test forcing both values to false, after
> checking the metadata and seeing if the application sign-in works or not?
Only you (or your management) can decide how much work you're obligated to do to protect the institution. I can tell you this is the tip of the iceberg and these are the *easy* tests.
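As one way of running that test with a Shibboleth IdP (other IdPs will differ), you can override the signing settings for a single SP in conf/relying-party.xml and then attempt a login. The entityID below is a placeholder for the SP under test:

```xml
<!-- conf/relying-party.xml (Shibboleth IdP v3/v4) -->
<util:list id="shibboleth.RelyingPartyOverrides">
    <bean parent="RelyingPartyByName"
          c:relyingPartyIds="https://sp.example.org/shibboleth">
        <property name="profileConfigurations">
            <list>
                <!-- Disable both signatures for this SP only. If sign-in
                     still succeeds, the SP isn't validating signatures. -->
                <bean parent="SAML2.SSO"
                      p:signAssertions="false"
                      p:signResponses="false" />
            </list>
        </property>
    </bean>
</util:list>
```

Remember to remove the override afterward; it exists to expose the bug, not to run in production.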
Obviously, chances are the data you're releasing is not the point. The issue is the data or business process being exposed by the service. After the tenth "who cares, it's nothing important" response you get, you start to decide that some people deserve what they get and worry more about protecting yourself by being transparent about what you do or don't do.
There is an unfortunate analogy in here somewhere to the question of how much COVID testing to do.