Conversation
Force-pushed 18b49a8 to d074327
that's out of tree because you start mini-coi with ...
```python
__export__ = ["add", "multiply", "get_message"]

def dijkstra_path(g, a, b):
    from networkx import from_dict_of_dicts, dijkstra_path
```
In MicroPython and PyScript, this can't just be part of the requirements.txt; it has to be a pure-Python package MicroPython can import and run without issues, and I believe that's not the case ... or is it?
Otherwise, I think you meant to use Pyodide (`type="py"`) instead, but I am guessing at this point ... still, that package needs to be available to each worker if it's not part of each runtime's stdlib.
NetworkX is in fact a pure Python package.
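For illustration, the `dijkstra_path` worker function in the snippet above could compute the same thing with the stdlib alone on the dict-of-dicts shape that `networkx.from_dict_of_dicts` accepts. This is a hypothetical stdlib-only sketch (the graph and variable names are made up, and it is not the networkx implementation):

```python
import heapq

def dijkstra_path(g, a, b):
    """Shortest path from a to b in a dict-of-dicts graph like
    {"x": {"y": {"weight": 1}}, ...} — the same shape that
    networkx.from_dict_of_dicts accepts."""
    dist = {a: 0}
    prev = {}
    queue = [(0, a)]
    done = set()
    while queue:
        d, node = heapq.heappop(queue)
        if node in done:
            continue
        done.add(node)
        if node == b:
            break
        for nbr, attrs in g.get(node, {}).items():
            nd = d + attrs.get("weight", 1)
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                prev[nbr] = node
                heapq.heappush(queue, (nd, nbr))
    # Walk the predecessor chain back from b to a.
    path, node = [], b
    while node != a:
        path.append(node)
        node = prev[node]
    path.append(a)
    return path[::-1]

g = {"a": {"b": {"weight": 1}, "c": {"weight": 5}},
     "b": {"c": {"weight": 1}},
     "c": {}}
print(dijkstra_path(g, "a", "c"))  # → ['a', 'b', 'c']
```

Being pure Python with no dependencies, something of this shape runs on MicroPython too, which sidesteps the packaging question entirely for small graphs.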
The fact that the virtual filesystem is unique per-interpreter is surprising. The documentation should call attention to that.
It's in WASM memory and shared with the environment that bootstraps it, so no surprise there, but there is also no way to have a single shared VFS: these runtimes bundle the Emscripten VFS anyway, and each has different logic and a different tree per VFS (Pyodide vs MicroPython vs any other language, where /tmp or /packages might or might not make any sense, together with symlinks and whatnot).
Well, it's surprising when I'm coming from desktop Python, where subprocesses don't share live objects but do share the filesystem. And I think a lot of your users will be coming from desktop Python.
No need to apologise - any feedback on docs is very welcome and thank you for the suggestions. Honestly, if folks like you (actual real users!) don't tell people like me (the dude who wrote the docs!) what's missing in the docs then things won't improve. If in doubt, just give feedback, it'll always be welcome! 🚀
> Workers have separate memory spaces. Each worker has its own memory, and you cannot share objects between workers or with the main thread. All communication happens via function calls with serialised data.
This paragraph also describes the situation with multiprocess parallelism on desktop, which misled me into thinking I could assume it was like multiprocessing in other ways; in particular, that I could use files to communicate between workers. Perhaps this would be better:
> Each worker is a separate Python interpreter, in a separate memory space, with a separate filesystem. You cannot share objects between workers, nor with the main thread. All communication between them happens via function calls with serialised data.
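The "serialised data" constraint can be simulated in plain desktop Python. The `call_worker` helper below is hypothetical, not a PyScript API: it forces every argument and result through JSON, roughly as the real worker boundary forces values through structured cloning (which is richer than JSON, but similarly rejects live objects):

```python
import json

def call_worker(func, *args):
    """Hypothetical stand-in for a PyScript worker call: arguments and
    results must survive serialisation; live objects do not cross."""
    sent = json.loads(json.dumps(args))       # like postMessage on the way in
    result = func(*sent)
    return json.loads(json.dumps(result))     # like postMessage on the way out

def add(a, b):
    return a + b

print(call_worker(add, 1, 2))                 # plain data crosses fine: prints 3

class Handle:
    """Stands in for any live object: an open file, a graph, a socket."""

try:
    call_worker(add, Handle(), 1)             # a live object cannot be serialised
except TypeError as e:
    print("rejected:", e)
```

The same mental model covers the filesystem point: a path string serialises and crosses the boundary just fine, but it names a file in the *caller's* VFS, which the other interpreter cannot see.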
I'm doing some docs revisions today. I'll incorporate this clarification. Thank you!
FWIW @clayote - if you ever want a "live" technical discussion we have a fortnightly community technical call in which we chat about deeply technical things. The details are in the "events" section of our PyScript discord server (and we use discord as the platform for the call). Feel free to add an item to the agenda when I announce it in the #chat channel.
I'm in New Zealand, where your technical calls happen at 4am. I don't think I can make it very often...
```text
pre-commit==3.7.1
python-minifier==3.1.0
setuptools==72.1.0
networkx~=3.6
```
This does not affect the Pyodide or MicroPython runtimes; you need to specify a config with `packages=['networkx']` for both runtimes (or even JSON), but it has to be usable from both Pyodide and MicroPython, and I am not sure the latter would understand the networkx package.
i.e.

```python
await create_named_worker(
    src="./worker_functions.py",
    name="mpy-worker0",
    type="py",
    config='{"packages":["networkx"]}',
)
```

Note I've changed `mpy` to `py` too, as I think MicroPython doesn't have the ability to run networkx while Pyodide does.
Then, when I tried this approach on pyscript.com, I got this error. Is the JSON what it's failing to load? I used only Pyodide there.
Well, whatever, I'll switch to Pyodide workers for the time being, and add another, similar MicroPython test later. I can run the pure-Python tests now! I guess it's just a documentation issue, and a regrettably misleading error message. However... I put in an explicit dependency on the ...
I'm afraid https://packages.pyscript.net/package/?package=random is not available ... is that a native Python thing? If so, shouldn't it be? To clarify: things that are not in the Python stdlib should be in `packages`; packages that are in the stdlib should not be part of `packages` ... they are not dependencies, they are core.
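To restate the distinction with runnable CPython (this also bears on the `from random import Random` surprise later in the thread: CPython and Pyodide ship the full `random` module, while MicroPython's trimmed `random` lacks the `Random` class):

```python
# Part of the stdlib: available in the runtime itself, so it is never
# listed in the config's `packages` — it is core, not a dependency.
from random import Random

rng = Random(42)               # seeding makes the sequence reproducible
value = rng.randint(1, 10)
assert 1 <= value <= 10
print(value)

# By contrast, networkx is NOT stdlib: it must be declared in the config
# (e.g. packages = ["networkx"]) before a worker can `import networkx`.
```

On MicroPython the import above would fail not because `random` needs installing, but because that runtime's stdlib subset omits the class.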
Turns out, requirements.txt has nothing to do with what gets installed in PyScript
There wasn't really a reason for me to use my own, just habit. It's weird that I can't `from random import Random`, though...
Oh, I think we may have stumbled into a compatibility issue between PyScript and Firefox as well, because the output of your code on Firefox is different. And then it hangs forever.
And I'm also getting different output from yours when I fork your fork of parallel-pathfinding in Chromium! It's the Linux Mint build of Chromium, which has some customizations, I suppose...
Force-pushed d56f8a3 to 25ca75a
Well, it will, once I get the `package.json` right...
I think I've gotten networkx to install in MicroPython? At least, now the test hangs, rather than giving me an error in `mip`...
When I forked your project on pyscript.com, it resulted in this project that, when run, gives me different output, though I haven't changed the code at all. Not sure what that's about.
Oh, it didn't hang! It just took longer than I expected to grab all those files. Here's the new failure I'm looking at: I probably have to add that dependency in the ...
for more information, see https://pre-commit.ci
The functions I want aren't there in micropython, apparently
One of those graphs is too small for all three.
…rallel` So I could do four workers now if I wanted
… in workers per se
# Conflicts:
#	core/tests/python/tests/test_workers.py



Description
Here's a test I wrote that should, in principle, verify that
I can't get it to run, though. Opening `index.html` directly gets me CORS errors; running `npx mini-coi -p 8000 tests/` in the `core` directory gets me errors in the Firefox console when I go to localhost:8000. In Chromium, I at least see the page to navigate to specific tests, but when I click the bottom option, `python`, I get an empty page with some 404 errors in the console. (I ran `make build` beforehand. Those files, `core.css` and `core.js`, ought to be there.)
Changes
Checklist
`make build` works locally.