Last Updated January 12, 2025
Macro leverages existing AI technologies to power several features within the app.
<aside> <img src="/icons/thought-dialogue_gray.svg" alt="/icons/thought-dialogue_gray.svg" width="40px" />
Wondering which to use? Contact [email protected] with a description of your use case for help choosing the best LLM for your needs.
</aside>
Macro utilizes AI integrations in the following features:
Loading any of these blocks within the app will trigger an API call to the chosen vendor, and file, chat, and other usage data will be shared. This cannot be turned off or altered.
Macro does not host any cloud instances for AI integrations; it uses each vendor's API endpoints to access the service. Data is encrypted in transit and at rest at all times.
Macro does not own or operate any cloud service provider (CSP) instances to access these LLMs. Please consult each LLM provider's API documentation for further security information.
Macro and the chosen AI provider process the entire file's data (text, images, code, etc.) when you use AI features in the app. Processing cannot be restricted to specific portions or data types within a file.
Data processed by AI features is never stored by any external AI CSP; it is stored only on our internal AWS instances. Additionally, no data is ever used for external model training or fine-tuning, and no external AI CSP will access the data after processing: