"Ghost text" for Copilot completions

The community has been looking into building its own Copilot extension, but the biggest issue right now is how to achieve a great UX (Completion UI considerations · Issue #3 · gobijan/nova-copilot-lsp · GitHub).

Copilot on VSCode introduced a special interaction model called “ghost text” to make it very easy to confirm or discard suggestions (Visual Studio Code and GitHub Copilot AI).

Is there any chance we could get a “ghost text” API for Nova? Copilot seems to be becoming table stakes in an editor, and there will be a few local-LLM alternatives very soon. I think it would pay off to enable the community to build extensions with a great completion experience.
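For context on what such an API could look like at the protocol level: LSP 3.18 includes a proposed `textDocument/inlineCompletion` request that editors can back their ghost text with, which is presumably what an LSP-based Copilot extension would speak. A rough sketch (in Python, purely illustrative; the document URI and cursor position are made up) of framing such a request over stdio:

```python
import json

def frame_lsp_request(request_id, method, params):
    """Frame a JSON-RPC request with the LSP Content-Length header."""
    body = json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": method,
        "params": params,
    })
    return f"Content-Length: {len(body.encode('utf-8'))}\r\n\r\n{body}"

# Hypothetical request: ask the server for ghost-text candidates at the
# cursor (the URI and position below are invented for illustration).
message = frame_lsp_request(1, "textDocument/inlineCompletion", {
    "textDocument": {"uri": "file:///project/app.py"},
    "position": {"line": 10, "character": 4},
    "context": {"triggerKind": 1},  # triggered automatically while typing
})
```

The editor's job would then be rendering the returned insert text dimmed at the cursor, with a keystroke to accept or dismiss it — which is exactly the piece an extension cannot build without editor support.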



I will definitely note your request! While I cannot guarantee it will be something we do, as our list of requested API improvements is long, we will continue to evaluate based on popularity and how it might impact or improve other extensions as well.


This would be fantastic. Demand for Copilot and similar tools will keep growing. It does seem like “ghost text” is the minimum feature needed to make it work well. Rider also uses the “ghost text” method.



I’ve been a fan of Nova for its native design and performance. However, I’ve noticed that I’ve been using it less and less because of its lack of support for integrated AI coding assistants. I understand the Nova core team’s concerns regarding legal liability, as previously discussed in this thread. However, with alternative AI solutions like Ollama and Continue, developers now have the option to use publicly available or self-hosted language models for coding assistance.

These AI tools have all standardized the “ghost-text” interface as the most natural and intuitive, and as such, I would love to see Nova consider its inclusion in a future release.



Thank you for the info! As you’ve said, we don’t have plans ourselves to integrate any AI services, but the inclusion of an annotation mechanism like inline suggestions is very likely; it’s mostly a matter of prioritizing it against other feature requests at the moment.

Nova has only one full-time engineer and one QA lead, so a lot of user-level features are currently taking priority over extension-level work, unfortunately, but the weight of these requests is still on my mind!


Thanks for the quick response! I understand the constraints of a small team. Looking forward to any future updates when time allows. Keep up the great work!


I just wanted to add my support for something like this. If there’s one thing that is going to get me to use something other than Nova it’s AI coding integration.

It’s a bit disappointing to me that the power of modern Macs to run good local LLMs is not being taken advantage of among the Mac-native editors, so I’d really like something that made that possible. It seems like a natural fit, but might require more glue code.

(On this front, unfortunately BBEdit has just integrated ChatGPT, not taking advantage of local LLMs.)


Preface: my engineering background does not intersect with LLMs in the slightest, so take what I say next with an adequate number of grains of salt. My knowledge is mostly pieced together from reading high-level overviews and whitepapers, as well as talking to colleagues with a far better understanding of these things.

My understanding is that while Apple’s Core ML works extremely well on-device for models that operate on, say, your personal photo library or a local audio library, accomplishing what services like GitHub Copilot do would be quite a bit more difficult locally. Not necessarily because of a lack of processing power, but because of the lack of diverse training data available to a single user.

Microsoft has very proudly stated that Copilot’s LLM runs across massive data centers fed by the bulk of the public repositories hosted on GitHub, which is largely what enables it to provide such accurate code completion and reasoning across a breadth of programming languages and scenarios.

As far as I am aware, attempting to train a similar code-intelligence model purely locally, against a user’s own projects, would not have the depth necessary to make it viable for general coding assistance. It would need to be trained on a reasonably large, preferably public-domain or fair-use input set covering a wide enough range of languages.

There are a couple of additional models I’ve seen in passing that were (potentially) trained in such a way (like Meta’s various CodeLlama iterations, and Tabby, which I believe uses CodeLlama internally by default), but I am not fully aware of the implications of how these would behave for a user locally.

Tabby, in particular, advertises itself as “self-hostable” and mentions using CoreML when running on Mac hardware, but it also has somewhat hefty local GPU requirements as well as a dependency on the Rust toolchain. Looking at its installation requirements, it also needs multiple tens of gigabytes of hard drive space to store the pretrained model data, which is obviously not something we could ship in a Mac app unless it were a separate download of some kind (completely ignoring the ethical and legal implications of these two particular models, which I have not researched to the point of being able to say “oh, we could use this”).


Thanks for the thoughtful reply. It’s got me trying to clarify what I actually want from Nova here.

My current workflow is to have an LLM write something, paste it into Nova, edit it, integrate it, and run it. If I get errors, I copy-paste all the relevant bits back into the AI and ask it what’s wrong (if it’s not obvious to me), then copy-paste the relevant fixes back. All this is incredibly clunky, but it’s still better than not using the AI.

So I guess, at a high level, what I want is for my IDE to fix this workflow for me. I can’t imagine copy-pasting stuff around indefinitely; eventually I’ll move to an editor that makes this process seamless.

On the specific implementation, I don’t think it’s even desirable for Nova to bundle an AI model; things are moving fast enough that it’s something I’d want to be able to change or upgrade in any case. As for what’s possible right now with local LLMs, one that can keep just the current file in the context window would be useful. Mistral 7B could do this easily in my experience. It does require a significant amount of RAM, obviously, but I think that’s the user’s choice. Its model is about 7 GB, and it runs faster than GPT-4 on my M1 Pro with 32 GB.
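For what it’s worth, wiring a local model into that loop is mechanically simple: tools like Ollama expose a small HTTP API on localhost, and a completion request with the current file as context is just a JSON POST. A hedged sketch (the model name, file contents, and instruction are placeholders; assumes an Ollama server at its default port, so the request is built but not sent):

```python
import json
import urllib.request

def build_completion_request(model, file_text, instruction):
    """Build a (not yet sent) request for Ollama's /api/generate endpoint."""
    payload = {
        "model": model,   # e.g. "mistral" -- placeholder model name
        "prompt": f"{instruction}\n\n{file_text}",
        "stream": False,  # ask for a single JSON response, not a stream
    }
    return urllib.request.Request(
        "http://localhost:11434/api/generate",  # Ollama's default address
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Hypothetical usage: keep just the current file in the context window.
req = build_completion_request(
    "mistral",
    "def add(a, b):\n    return a +",  # stand-in for the open file's text
    "Complete the following code. Reply with code only.",
)
# Actually sending it would be urllib.request.urlopen(req) -- omitted here,
# since it requires a running Ollama server.
```

The point being: the hard part isn’t talking to the model, it’s having somewhere in the editor to render what comes back.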


I just wanted to add my 2 cents here:

My productivity as a developer improved drastically when I started using Copilot, and it’s had the effect of making me reluctantly switch back to VSCode for most things. A ghost text interface that would allow extensions in Nova to present the same interface as VSCode (and other editors) do for these multi-line suggestions would tip me back to Nova.

My experience is just anecdotal for now, but with how popular AI assistants are getting for coding — not to mention how much they can boost your productivity — I think having support for at least the most common interface to those assistants would be a huge benefit.

As a side note, ghost text is also useful for things like inline git blame annotations and other context-specific information. Another place I see this in VSCode is function signatures in Elixir projects, where the type information is inferred by the LSP and displayed above the function definition. So a ghost-text interface isn’t limited to LLMs; it’s more generally useful for all kinds of information that can be presented in-editor.
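The blame case is a nice illustration of why this generalizes: the annotation text comes from ordinary tooling, not a model. A small sketch (in Python; the blame output below is a fabricated sample in `git blame --line-porcelain` format, trimmed to the headers the sketch actually reads):

```python
from datetime import datetime, timezone

# Fabricated sample of `git blame --line-porcelain -L 12,12 <file>` output,
# trimmed to the author/author-time headers used below.
SAMPLE = """\
1a2b3c4d5e6f 12 12 1
author Jane Doe
author-time 1700000000
\tdef handle_request(conn):
"""

def blame_annotation(porcelain):
    """Turn line-porcelain blame output into a short ghost-text string."""
    author, when = "?", None
    for line in porcelain.splitlines():
        if line.startswith("author "):
            author = line[len("author "):]
        elif line.startswith("author-time "):
            ts = int(line[len("author-time "):])
            when = datetime.fromtimestamp(ts, tz=timezone.utc)
    return f"{author}, {when:%Y-%m-%d}" if when else author

annotation = blame_annotation(SAMPLE)  # e.g. "Jane Doe, 2023-11-14"
```

An editor-level ghost-text API would just need to accept a string like this plus a position, regardless of whether the producer is an LLM, git, or an LSP server.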


I completely agree. There are a lot of mechanisms I’d like to add and expose to extensions as well; it’ll just take a bit of time as we work through them. I tend to be cautious about adding things so that I get them roughly “right” (or at least in the realm of “right”) the first time, and we don’t have to upend what folks build on what we provide too often. But that must be balanced with providing what folks need when they need it. I’m still figuring out that balance. :smile:


I personally dislike ghost text as a UI, and I also dislike overlays showing up when I hover over text (like Nova does for documentation).

I bring these up in this discussion to propose a solution to the hover text issue and simultaneously a presumably easier implementation for multi-line code completion that could be an intermediate step towards ghost text for those who wish to enable it. That solution is a keyboard-activated overlay containing styled text — pretty much the documentation hover text UI but triggered by (and dismissible by) a keyboard shortcut. One shortcut could be for docs, and one for code completion. While it’s showing, it would be nice to have some keyboard controls for making the overlay bigger or smaller. Ideally the feature would also be available to plugin authors too.

I’d even advocate for @eahanson’s alternative solution if ghost text completion is problematic for other reasons. I continue to pay for a Nova license without ever using it. I’ve come to appreciate the augmentation Copilot provides to my workflow, and it’s valuable enough that I’m unwilling to give it up, but I hold out hope that supporting Nova will eventually allow it to catch up here.
