Data breaches can and do happen. That's true of any app you use, but because of the sheer amount of data generative AI models are trained on, there's a lot at stake. Researchers have gotten ChatGPT to reproduce parts of its training data, and the tool has exposed users' entire chats to other users. Practice good data security and never put anything into an AI tool that you wouldn't post publicly online.
Freeware
AI tools are extremely expensive to develop and run, yet many companies offer free access to at least some of their features. Even big companies have lost money on their AI offerings. So why provide anything for free? It isn't out of the kindness of their hearts. They're getting something from you other than money, something that has value to them. It may be your data, your labor, or something else, and it may be something you aren't aware you're giving.
Check the Terms of Use carefully for any software you use. For instance, Google's Gemini Apps Privacy Hub states that Google can retain your data, including conversations, location, and device type, for up to three years. According to OpenAI's Privacy Policy, they "may provide your Personal Information to third parties without further notice to you, unless required by the law." Most tools, especially free ones, say something similar. Be very careful about when and how you use them.
Microsoft Copilot
By logging in with your VSC email address, you have access to Copilot with commercial data protection. This means your data is private to you and protected from Microsoft (and the VSC). Your inputs won't be used to train the model, although you still shouldn't share private or sensitive information. You can log into Copilot in any browser at https://copilot.microsoft.com/, or get the fullest set of features by logging into the Edge browser. Look for the green "Protected" shields at the upper right and just above the textbox:
[Screenshots: the green "Protected" shield shown at the upper right of the window and just above the textbox]