A common use case for me is to use Excel as a UI as follows:
1. data entry of inputs in a worksheet, using worksheet-scoped, well-known range names
2. press a menu item for a particular operation
3. gather all necessary inputs (via the well-known range names) and serialize the values as a blob of YAML/JSON
4. run a command remotely (on a Linux host/cluster) against the serialized blob and retrieve the results (usually a dictionary) as a blob of YAML/JSON
5. 'distribute' the results back to Excel (keys of the dictionary correspond to Excel named ranges)
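Steps 3-5 amount to a dict-to-JSON round trip. A minimal sketch (the `gather`/`distribute` helpers and the `workbook` dict are hypothetical stand-ins for whatever named-range accessors your add-in, e.g. pyxll, actually provides):

```python
import json

# Hypothetical stand-in for the workbook: a plain dict plays the role of
# the worksheet's well-known named ranges.
workbook = {"input_a": 2, "input_b": 3, "result_total": None}

def gather(names):
    # Step 3: collect the well-known named ranges into a plain dict
    return {name: workbook[name] for name in names}

def distribute(results):
    # Step 5: write each key of the result dict back to the matching named range
    for name, value in results.items():
        workbook[name] = value

# Step 3: serialize the inputs as a JSON blob
blob = json.dumps(gather(["input_a", "input_b"]))

# Step 4 (faked locally): the remote command parses the blob and returns results
inputs = json.loads(blob)
results_blob = json.dumps({"result_total": inputs["input_a"] * inputs["input_b"]})

# Step 5: distribute the results back to 'Excel'
distribute(json.loads(results_blob))
print(workbook["result_total"])  # 6
```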
Step 4 can use any transport (I use ssh via plink, and HTTP), but when the remote end is also a Python program it may be convenient to drop the explicit serialization step and have whatever Python glue you are using do the work transparently. Some candidates:
1) ipyparallel (the newly packaged IPython parallel stuff)
2) IPython/Jupyter client (Tony has already contributed an embedded IPython *kernel* within Excel; I am talking about the other direction)
3) RPyC etc.
4) cloud stuff (I don't know much about this)
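For comparison, the explicit-serialization transport of step 4 is just "write JSON to the remote command's stdin, read JSON from its stdout". A local sketch, using a `python -c` one-liner as a stand-in for the real plink/ssh invocation of a script on the Linux host:

```python
import json
import subprocess
import sys

# The 'remote' command: reads a JSON dict on stdin, writes a JSON dict on stdout.
# In the real setup this would be a plink/ssh invocation of a script on the host.
remote_cmd = [sys.executable, "-c",
              "import json, sys; d = json.load(sys.stdin); "
              "print(json.dumps({'result_total': d['input_a'] + d['input_b']}))"]

blob = json.dumps({"input_a": 40, "input_b": 2})
out = subprocess.run(remote_cmd, input=blob, capture_output=True, text=True, check=True)
results = json.loads(out.stdout)
print(results)  # {'result_total': 42}
```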
One big potential advantage I see to using the ipyparallel stuff is that you get a nice abstraction, the DirectView, for parallel function application (either sync or async, load-balanced or round-robin). I think it would fit nicely into Excel.
I am interested in exploring APIs for remoting in a flexible way that plays nicely with Excel UDFs and 'background' operations.
# Start a 4-engine IPython cluster on the remote host over ssh (note: the ipcluster flag is --cluster-id)
plink -batch -ssh remote-host.somecompany.com ipcluster start --profile=remoting_fun --cluster-id=excel --n=4
# Here is the client code running within pyxll
from ipyparallel import Client  # on older installs: from IPython.parallel import Client

# connection file copied over from the remote host's profile directory
f = '/some/path/which/works/on/windows/.ipython/profile_remoting_fun/security/ipcontroller-excel-client.json'
c = Client(f)            # the first positional argument is the connection (url) file
v = c.direct_view()      # DirectView over all engines; c.load_balanced_view() is the load-balanced alternative
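The sync/async pattern on the DirectView is hard to demonstrate without a live cluster, so here is the same shape sketched locally with concurrent.futures as a stand-in: the blocking map corresponds to v.map_sync(fn, inputs), and submit-then-collect corresponds to ar = v.apply_async(fn, x); ar.get(). The price function is a made-up placeholder.

```python
from concurrent.futures import ThreadPoolExecutor  # local stand-in for the remote engines

def price(x):
    # placeholder for whatever function you would ship to the engines
    return x * x

# the pool plays the role of the 4 engines started by ipcluster above
with ThreadPoolExecutor(max_workers=4) as pool:
    # synchronous, blocking application -- like v.map_sync(price, inputs)
    results = list(pool.map(price, [1, 2, 3, 4]))

    # asynchronous application -- like ar = v.apply_async(price, 10); ar.get()
    future = pool.submit(price, 10)
    # ... the Excel UI could stay responsive here while the work proceeds ...
    later = future.result()

print(results, later)  # [1, 4, 9, 16] 100
```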