Directive on Automated Decision-Making; Releasing source code. #72

Open
KingBain opened this issue Jul 30, 2019 · 4 comments

Comments

@KingBain
Contributor

KingBain commented Jul 30, 2019

Has anyone noticed the wording they added to the AI directive about releasing source code?
https://www.tbs-sct.gc.ca/pol/doc-eng.aspx?id=32592

------------------------------------------------
Release of Source Code
6.2.6
Releasing custom source code owned by the Government of Canada as per the requirements specified in section C.2.3.8 of the Directive on Management of Information Technology, unless:

6.2.6.1 The source code is processing data classified SECRET, TOP SECRET or PROTECTED C;

6.2.6.2 Disclosure would otherwise be exempted or excluded under the Access to Information Act, or

6.2.6.3 An exemption is provided by the Chief Information Officer of Canada.

6.2.7 Determining the appropriate access restrictions to the released source code.
------------------------------------------------

Why do you think source code for things that process SECRET, TOP SECRET, or PROTECTED C data needs a special process to be released?

"Processing" could mean anything... like a test script (TDD) or a spellchecker or whatever?!

@pabrams
Contributor

pabrams commented Jul 30, 2019

I think even with AI-related code, the most you can glean from it (and I'm no AI programmer, so don't take my word for it) is metadata such as the size and type of fields. Even that might be considered configuration and stored separately.
People can still make mistakes like committing data to source control, which does happen (AI or not), but I think everyone would agree that's a separate problem that shouldn't prevent sharing code in general.
It feels like they're trying to prevent people from open-sourcing the output of machine learning or other processes that have already consumed sensitive data, but according to their definition of source code, that's excluded anyway.

@pabrams
Contributor

pabrams commented Jul 30, 2019

I couldn't see how to officially report the problem, so I tweeted to the CIO, who graciously forwarded the issue to an appropriate person.

@gcharest
Member

Very good question, and unfortunately I can't answer it myself.

@MrDeshaies

https://twitter.com/MrDeshaies/status/1171471348373213185

There's more experience with classifying information than with classifying code. The current wording considers where and how the code is being used. The plan is to align the Directive with the Open Source Guidelines once they're published.
