Activity · microsoft/Olive

Repository: microsoft/Olive (public, owned by the microsoft organization; default branch `main`; created 2019-08-12). Newest activity first; 30 items per page, with further pages available.

- 2024-09-19 16:58 UTC · shaahji force-pushed `shaahji/gptq`: "Make certain gptq options customizable via model specific mapping. This avoids hardcoding these parameters in config files for models (like phi3) that are not yet officially supported by auto-gptq."
- 2024-09-18 22:54 UTC · azure-pipelines[bot] pushed 1 commit to `gh-pages`: "Update docs from 9301aae"
- 2024-09-18 22:51 UTC · jambayk force-pushed `jambayk/iter-metric`: "save for now"
- 2024-09-18 22:48 UTC · jambayk deleted branch `jambayk/auto-opt-cli`
- 2024-09-18 22:47 UTC · jambayk pushed 10 commits to `auto-opt-cli`: "Merge branch 'main' into auto-opt-cli"
- 2024-09-18 22:46 UTC · jambayk created branch `jambayk/auto-opt-cli`: "merge main"
- 2024-09-18 22:42 UTC · jambayk deleted branch `jambayk/separate-cli`
- 2024-09-18 22:42 UTC · jambayk merged PR #1361 into `main` (commit 9301aae): "Engine: Improve output structure, CLI: Configurable model options, Separate `finetune` + `generate-adapter` (#1361)". From the PR description:
  - Engine: the output folder structure of a workflow run is improved. The previous structure was designed for multi-EP, multi-pass-flow workflows, which is not the common Olive usage; the unnecessary nesting for the accelerator spec and pass flows is removed in the single-EP, single-pass-flow case.
  - `output_name` is removed from both the pass config and the engine config. Its behavior was arbitrary; users can place output in a specific folder by providing `output_dir` directly, e.g. `parent-dir/specific-dir`. `output_name` also allowed a pass to save intermediate models, but the same can be achieved more cleanly by providing multiple pass flows such as `[[A, B], [A, B, C]]`. See `Engine.run` for details on the new output structure.
  - CLI: `add_model_options` is now configurable so that only the desired model-type options are added; `save_output_model` uses the new engine output directory structure to copy the output model into the final output directory; the `finetune` command is split into `finetune` and `generate-adapter` commands, which can be chained as shown in the llama2 multi-LoRA notebook.
  - The PR checklist records that unit tests were added, all tests pass, documentation was updated, and `lintrunner -a` was applied.
- 2024-09-18 22:07 UTC · shaahji force-pushed `shaahji/cliquant`: "Quantize: CLI command to quantize input model". Usage (argument values are elided in the original message): `olive quantize -m --trust_remote_code --device --algorithms --data_name --train_subset --batch_size --tempdir -o`
- 2024-09-18 22:05 UTC · shaahji force-pushed `shaahji/gptq` (same gptq commit message as above)
- 2024-09-18 22:04 UTC · jambayk deleted branch `jambayk/engine-out`
- 2024-09-18 21:57 UTC · jambayk pushed 1 commit to `jambayk/separate-cli`: "fix test"
- 2024-09-18 21:40 UTC · jambayk pushed 1 commit to `jambayk/separate-cli`: "same"
- 2024-09-18 21:29 UTC · jambayk pushed 3 commits to `jambayk/separate-cli`: "generate-adapter supports onnx model"
- 2024-09-18 20:51 UTC · shaahji force-pushed `shaahji/gptq` (same gptq commit message as above)
- 2024-09-18 20:50 UTC · shaahji created branch `shaahji/gptq`: "Make certain gptq customizable via model specific mapping"
- 2024-09-18 18:47 UTC · shaahji force-pushed `shaahji/cliquant`: "Quantize: CLI command to quantize input model". An earlier form of the usage line (values again elided): `olive quantize --m --device --algorithms --data_config_path -o`. The message also lists other code improvements: the global functions in cli/base.py were made static members of BaseOliveCLICommand, since they are only usable in the context of CLI command implementations and this avoids repeated imports in each one; and new helpers (`add_data_config_options`, `add_hf_dataset_options`, `add_accelerator_options`) were added to avoid code duplication and standardize the CLI command implementations.
- 2024-09-18 18:17 UTC · shaahji force-pushed `shaahji/cliquant` (same quantize commit message as above)
- 2024-09-18 01:58 UTC · jambayk pushed 1 commit to `jambayk/separate-cli`: "nit"
- 2024-09-18 01:45 UTC · jambayk pushed 1 commit to `jambayk/separate-cli`: "multilora example update"
- 2024-09-18 01:09 UTC · jambayk pushed 1 commit to `jambayk/separate-cli`: "base improvement"
- 2024-09-18 00:04 UTC · jambayk force-pushed `jambayk/separate-cli`: "run out and separate finetune"
- 2024-09-17 23:39 UTC · jambayk force-pushed `jambayk/separate-cli`: "run out and separate finetune"
- 2024-09-17 23:01 UTC · shaahji force-pushed `shaahji/cliquant` (same quantize commit message as above)
- 2024-09-17 22:50 UTC · jambayk pushed 1 commit to `jambayk/engine-out`: "run cli command in subprocess"
- 2024-09-17 21:54 UTC · jambayk pushed 1 commit to `jambayk/engine-out`: "fix flow output dir"
- 2024-09-17 21:38 UTC · jambayk pushed 1 commit to `jambayk/engine-out`: "fix et3 + doc"
- 2024-09-17 21:35 UTC · shaahji force-pushed `shaahji/cliquant` (same quantize commit message as above)
- 2024-09-17 21:06 UTC · jambayk pushed 1 commit to `jambayk/engine-out`: "fix et2"
- 2024-09-17 20:13 UTC · jambayk pushed 1 commit to `jambayk/engine-out`: "fix et"
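The PR #1361 description notes that intermediate models, previously saved via `output_name`, can instead be obtained by listing multiple pass flows such as `[[A, B], [A, B, C]]`. As a minimal sketch only, the idea could look like the config fragment below; the pass names (`conversion`, `transformers_optimization`, `quantization`) and the model path are illustrative placeholders not taken from this activity feed, and the exact config schema may differ between Olive versions:

```json
{
  "input_model": { "type": "HfModel", "model_path": "placeholder/model" },
  "passes": {
    "conversion": { "type": "OnnxConversion" },
    "transformers_optimization": { "type": "OrtTransformersOptimization" },
    "quantization": { "type": "OnnxQuantization" }
  },
  "pass_flows": [
    ["conversion", "transformers_optimization"],
    ["conversion", "transformers_optimization", "quantization"]
  ],
  "output_dir": "models/out"
}
```

Each inner list runs as its own pass flow, so the two-pass flow produces the intermediate (unquantized) model while the three-pass flow produces the final quantized one, which is the cleaner alternative to `output_name` that the PR describes.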