- That window seat with the 14” laptop seems extremely claustrophobic.
That’s the real limitation on an economy flight - space rather than power or the internet… at least it would be for me.
The only times I was able to get my laptop out and do some productive work were when I was either sitting in a premium economy aisle seat with room to spare or when there was an empty seat next to me.
- MacBook cable: 94W delivered
The return flight will test this with the correct cable. I expect at least a 16% improvement over the 70W cap.
Some plane sockets cut out completely if you attempt to draw more than the limit, rather than continuing to provide power at the limit.
- This has been exactly my experience too. I've tried multiple harnesses (pi, claude code, codex) with multiple variants of qwen3.6 and gemma4, driven by both mlx and ollama - and every single time I try to do anything meaningful I end up in a loop. This is on a 64GB MacBook Pro M3 Max.
I really don't know what the hell people are doing locally, and suspect a lot of the hype around running these models locally is bullshit. Sure, you can make it do something but certainly nothing useful or substantial.
by seattle_spring
1 subcomment
- With more and more flights offering Starlink, I don't see why this would really ever be necessary.
Also, agreed with the other commenters: just read a damn book and take a nap.
- Interesting, I did and documented the same kind of experiment a few months ago [1]; it looks like so much has changed since then!
[1] https://betweentheprompts.com/40000-feet/
- Can’t wait for more people to do the same and laptops eventually getting banned on board due to fear of them catching fire..
by mumbisChungo
0 subcomments
- >Qwen 4.6 36B
Did the author mean Qwen3.6-27B? Qwen3.6-35B-A3B?
- As much as it's a fun gimmick to run a relatively good sized LLM like qwen 3.6 35B locally, I would much rather have the ability to run it remotely on a piece of hardware I control via VPN session. Much better on battery life and heat. If I'm on an airplane I care about having as much battery life as possible.
Let's say you have a basic setup like llama.cpp and llama-server on a remote server (even if it's just sitting under your home office desk) running a 35GB Q8 quantized model of qwen 3.6 35B, it's not difficult to make llama-server available to your laptop over just about any form of internet connection and VPN.
Having the ability to run that same model locally is still useful for when no internet connection whatsoever is available, but the times that you simultaneously have no internet and a serious need for something the model can output are fairly rare these days.
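As a rough sketch of the remote setup described above (the model path, port, and the VPN address `10.0.0.2` are illustrative assumptions, not details from the comment):

```shell
# On the remote server (e.g. the box under the home office desk):
# serve a Q8-quantized GGUF model with llama-server, listening on all
# interfaces so the VPN tunnel can reach it.
llama-server \
  --model ~/models/qwen-35b-q8_0.gguf \
  --host 0.0.0.0 \
  --port 8080

# On the laptop, through the VPN (assuming the server's tunnel IP is
# 10.0.0.2): llama-server exposes an OpenAI-compatible endpoint.
curl http://10.0.0.2:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"messages": [{"role": "user", "content": "Hello"}]}'
```

Any VPN that gives the laptop a route to the server (WireGuard, Tailscale, etc.) works here; the laptop only spends battery on the network request, not on inference.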
- Qwen 4.6 36B? Do they mean Qwen3.6-35B-A3B?
by devsecopsify
0 subcomments
- Author here
Great engagement!
Apologies for the sloppiness in model names. I was writing it at 1am in Vegas, which was 9am in London (the time my body was operating on) - I was dozing off while finishing it, but excitement about my offline experience kept me _somewhat_ going. I've fixed the model-naming inconsistencies.
I've updated the post with some more info and responses to your comments; I hope you find it useful.
- I don't know if this has been asked here, but is it really possible to connect to a free search service like Google? The point is to use information that's relevant to, say, 2026.
by prettyblocks
0 subcomments
- I've been relatively impressed with local llms, but by far the hardest thing for me is how hot they make my laptop run.
by asimovDev
1 subcomment
- My 14in M3 Max turns the fans up way too much when running local agentic coding for me to be comfortable using it in a public place.
- Trying LLMs in the air with a 6,200 EUR laptop... Sorry if it's not exactly relatable..
by builderminkyu
1 subcomment
- tried doing exactly this with ollama on a cross-country flight last month. my macbook basically turned into a jet engine and the battery died in under an hour.
curious if you had to heavily throttle the cpu or stick to super small quants (like 4-bit phi3) to actually make it through 10 hours without a power outlet?
- To be honest, I think the ability to work while traveling is a con rather than a perk of current times.
- [dead]
by yjadsfgasdf
0 subcomments
- [dead]
- Can’t you guys just read a book and take a nap?
by itsuckslol
0 subcomments
- [dead]