And then when I do get past the password, it sends an OTP to a Mac Mini I never use, and I have to tap around to get it to generate an SMS code. No option for external TOTP, and no way to remove the Mac Mini I don't use from OTP without signing out of it.
Google also gives me a ton of issues with having multiple accounts. Go to the calendar app with account 2, switch to desktop mode so I can actually click on the meeting invite, and now I'm logged back into account 1. Similar issues trying to use any other Google service across accounts.
I don't understand how these kinds of things aren't priority #1.
None of these issues on my other profiles or in incognito.
It only works in incognito because it’s using a different IP address there…
That sounds like a good magic trick.
I had not even heard of app stores before then, IIRC, unless you count Linux repos.
So PWAs would have been more than fine, but, unfortunately, that ship has long since sailed, and Apple makes way too much money from the App Store for a course change.
If it’s only mean old Apple, where are all of the great Android PWAs and why do developers decide to make native Android apps?
Once hybrid became possible, it was immediately clear that it was the easiest way to get a decent-quality app deployed on both iOS and Android. It was a big enough deal that around the time I attended that VSIP event and then PhoneGap Europe, or perhaps shortly afterward, a backlash against hybrid kicked off, with a few big companies trumpeting about how they'd started off native, gone hybrid for a few years, and were now going back to native again (principally for native experience and performance reasons).
But I think the pressure has always been in the hybrid direction, particularly if you're resource- or budget-constrained and need to target both platforms, or the web is your main platform (whether that be mobile or desktop). I'm sure the Epic vs Apple fight didn't do any harm, but I don't know what real difference it's made.
The reality is that maintaining two native apps plus a web app is a pain in the ass, especially when you realise Swift - whilst a good language - is a wrapper over some decidedly tedious APIs and a lot of Objective-C legacy that you probably don't want sucking up a lot of time. If you want/need apps, it's so much easier to stick a native wrapper around a responsive web app, and that will work well for so many use cases. Not all, by any means, but most SaaS, LOB, or CRUDy apps will do fine as hybrid.
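To put a rough shape on that "native wrapper" route: a tool like Capacitor basically points generated iOS/Android shells at the built output of your existing responsive web app. A minimal sketch of its config file, with the app id, name, and output directory made up for illustration:

```typescript
// capacitor.config.ts -- minimal sketch of wrapping a responsive web app
// in native shells. appId, appName and webDir are placeholder values.
import type { CapacitorConfig } from '@capacitor/cli';

const config: CapacitorConfig = {
  appId: 'com.example.crudapp', // hypothetical bundle identifier
  appName: 'Example CRUD App',  // hypothetical display name
  webDir: 'dist',               // wherever your web build ends up
};

export default config;
```

From there, `npx cap add ios` and `npx cap add android` generate the native projects, and the "app" is essentially your existing web front end running in a WebView.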
https://thenewstack.io/50-years-ago-a-young-bill-gates-took-...
Perens had accepted a position as senior Linux/Open Source Global Strategist for Hewlett-Packard, which he describes as leaving Apple “to work on Open Source. So I asked Steve: ‘You still don’t believe in this Linux stuff, do you?'” And Perens still remembers how Steve Jobs had responded.
“I’ve had a lot to do with building two of the world’s three great operating systems” — which Jobs considered to be NeXT OS, MacOS and Windows. “‘And it took a billion-dollar lab to make each one. So no, I don’t think you can do this.'”
Perens says he later "won that argument" when Jobs stood onstage in front of a slide that said ‘Open Source: We Think It’s Great!’ as he introduced the Safari browser.
While, yes, some software has come in that format, it took the big three to push the Linux-based server clouds, Google to push it on phones, tablets, and laptops, and now Steam to make a push for the average gamer.
This is not to discredit the work being done outside those labs, which very much builds on work done for free or by foundations; however, the first versions just don't capture a majority of the available markets, which the OSes Jobs mentioned very much did, as have the others from billion-dollar labs since.
What has been shown is that it takes billions of dollars to market an OS to the general public.
Doesn't really sound like Jobs was putting up much of a fight there.
Before Apple’s App Store launched, my iPhone was running all sorts of other apps and alternative launchers.
Apple had to move fast to keep things from getting too out of control.
Over the years, as the vulnerabilities in the OS were closed and iOS added features, the need or desire to bother with jailbreaks and 3rd party pirate app stores dropped. I haven’t thought about it in many years.
I avoid apps as much as possible due to all the nefarious tricks they play, even with all the sandboxing and review they go through. Without those constraints, I can't imagine the hell that we'd be in.
But sometimes people like to do stuff like configure their QMK keyboards or load new firmware for their EdgeTX drone radios or make bootable USB sticks, all tasks that work just fine in easily deployed PWAs on every client platform in existence, except iOS.
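(For a rough idea of how the QMK case works on platforms that allow it: WebHID, which Chromium-based browsers ship and iOS Safari does not, lets a page talk to the keyboard's raw HID interface. The sketch below uses QMK's documented 0xFF60/0x61 usage page/usage pair, but the report payload is made up -- the real format depends on the firmware -- and the HID type declarations typically come from a package like @types/w3c-web-hid.)

```typescript
// Rough sketch: opening a QMK keyboard's raw HID interface via WebHID.
// Chromium-only; unavailable in iOS Safari, which is the complaint above.
async function openQmkKeyboard(): Promise<HIDDevice> {
  // 0xFF60 / 0x61 is QMK's raw HID usage page / usage.
  const [device] = await navigator.hid.requestDevice({
    filters: [{ usagePage: 0xff60, usage: 0x61 }],
  });
  if (!device) throw new Error('No QMK device selected');
  await device.open();
  return device;
}

async function sendPlaceholderReport(device: HIDDevice): Promise<void> {
  device.addEventListener('inputreport', (event) => {
    const report = event as HIDInputReportEvent;
    console.log('keyboard replied:', new Uint8Array(report.data.buffer));
  });
  const payload = new Uint8Array(32); // made-up 32-byte report body
  payload[0] = 0x01;                  // made-up command byte
  await device.sendReport(0, payload); // QMK raw HID uses report ID 0
}
```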
For small developers of small-yet-oddball client apps, PWAs are an absolutely magnificent platform. Write once, deploy once, run... everywhere-but-an-iPhone. It really sucks that Apple's devices are crippled like this.
Edit to reply to this bit:
> Without those constraints, I can't imagine the hell that we'd be in.
Again, that hell is literally every other platform on the planet. It's only Safari that is "protected". In point of fact browser permissions management on this stuff tends strongly to be stricter and less permissive than app permissions, which are much less visible.
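(To make the visibility point concrete: a page has to ask for each capability separately and can see exactly where it stands. A rough sketch using the Permissions API; 'geolocation' is just an example permission name.)

```typescript
// Rough sketch: the web's permission model is per-capability and queryable.
// 'geolocation' is only an example; other permission names work the same way.
async function showGeolocationPermission(): Promise<void> {
  const status = await navigator.permissions.query({ name: 'geolocation' });
  console.log(`geolocation is currently: ${status.state}`); // 'granted' | 'denied' | 'prompt'
  status.onchange = () => console.log(`geolocation changed to: ${status.state}`);
}
```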
I really, really, love building stuff for/on the web. When working with founders/clients we'd often start with building the MVP as a PWA, because of how easy it is to iterate and test. (https://untested.sonnet.io/notes/web-and-feedback-loops/)
That said, some reasons off the top of my head in random order:
- seemingly small but UX-critical features breaking or not working at all (wake lock, audio, notifications, scrolling)
- most users don't know (or haven't been taught) that they can install a site, or assume that PWAs are inherently worse
- PWAs are harder to monetise (no super-easy way to let the user pay for a lifetime licence for the app; customers want super easy, and that's not for me to judge)
- features that are critical but non-obvious to a non-technical person (and thus difficult to explain) are unstable or janky on iOS when running standalone/from the home screen (example: offline storage wiped every few days; see the sketch just after this list).
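To make that last point concrete, this is the kind of code you end up writing. Both calls are standard web APIs; whether they actually hold up, particularly in iOS Safari or a home-screen app, is exactly the unpredictability I mean:

```typescript
// Sketch of two of the flaky areas above: asking the browser not to evict
// offline storage, and holding a screen wake lock. Standard APIs, but the
// behaviour (especially on iOS in standalone mode) varies by platform.
async function requestPersistentStorage(): Promise<void> {
  if (navigator.storage?.persist) {
    const granted = await navigator.storage.persist();
    console.log(granted ? 'storage marked persistent' : 'storage may still be evicted');
  }
}

async function keepScreenAwake(): Promise<WakeLockSentinel | null> {
  try {
    // Screen Wake Lock API; may be missing or rejected depending on the browser.
    return await navigator.wakeLock.request('screen');
  } catch {
    return null;
  }
}
```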
In some ways things work better than, say, 10 years ago, but at the same time there's the *unpredictability*. I really don't want to worry about my app breaking in some impossible-to-fix way next year. Not when the app is meant to pay my rent.
Performance was rarely an issue, discounting experiments like running image recognition inside a "service worker" in JS on an iPhone 7 for an AR game I was messing with. That was in 2016 (before Pokémon Go came out and kind of dumbed down the idea of AR).
Apple neutered the web as best they could to force you to use their rails.
I'm still angry they killed Flash. There has never been a better platform for non-technical folks, kids especially, to make animation, games, and mini apps, and deploy them as single binary blobs.
A single swf file could be kept and run anywhere. For the younger generation: imagine right clicking to download a YouTube video or a video game you'd see on itch.io. And you could send those to friends.
You could even embed online multiplayer and chatrooms into the apps. It all just worked. What we have now is a soup of complexity that can't even match the feature set.
During the Flash era, creativity flourished. It was accessible, too. Seven-year-olds could use it.
Flash was getting better and better. It could have become an open standard had Jobs not murdered it to keep runtimes off iPhone. He was worried about competition. The battery and security issues were technical problems and fully solvable.
The companies that filled the web void - Google and Apple - both had their own selfish reasons not to propose a successor. And they haven't helped anyone else step up to the plate. It would be impossible now.
Imagine if apps for mobile could be deployed via swf. We'd have billions of apps, and you could just tap to download them from the web.
Smartphones might have pushed us forward, but the app layer held us back.
The 1990s and 2000s web saw what AOL and Microsoft were trying to lock us into and instead opted for openness and flexibility.
Platformization locked us into hyperscaler rails where they take a cut of everything we do. This has slowed us down tremendously, and a lot of the free energy and innovation capital of the system goes to taxation.
But the creation tools and the culture never really lined up the same way, and developers focused on creating apps instead.
For non-games, HTML has always been technically superior. iOS Safari may have a long history of rendering bugs, but it beats Flash/AIR, which always looked very out-of-place even on desktop.
I do wonder what would have happened in an alternate universe where either Flash or HTML5 took off on mobile instead of apps. We would have both the upsides of openness, and the downsides of worse performance and platform integration and the lack of an easy payment rail. Pretty much the same situation we still see on desktop today.
We wouldn't have had the same "gold rush" from the early App Store, which happened in large part because of the ease of making money. There would probably be more focus on free stuff with ads, like Android but more so.
(I know I'm mixing different levels here, and my personal experience isn't really an argument).
ps: HTML's scope is way more advanced than anything Flash could have been.
No, they wouldn't. We've forgotten just how bad and sloppy Flash apps were. The handful of companies that used Adobe Flex turned out awful POSes that barely worked. It occupied the same space that Electron does today -- bloated, slow, and permitting cheap-ass devs to utilize cheap talent to develop 'apps' with all the finesse of a sledgehammer.
As a kid I loved Flash; I was making interactive apps in AS2/3 in high school. But I watched in horror as it became the de facto platform for crapware.
This. Except Electron crap at least runs on top of a well-designed and relatively reliable platform (HTML/Chromium) - and sometimes the crap even offers an actual PWA version with all the sandbox benefits a real browser has to offer. Flash didn't even have that.
And let's be realistic, there will always be demand for a crap-running platform for vendors that don't care (or just have their core values elsewhere).
My kingdom for some way of gatekeeping platforms so that entities like this are forbidden from participating
- Lack of gatekeeping was THE advantage that made Web viable and competitive against traditional media.
- You can't gatekeep crapmakers without also gatekeeping that kid in his parents' basement with an awesome idea.
- Crapmakers with enough money will punch through any gatekeeping.
- Sometimes you have to accept that vendors don't care. Can't expect a transport company to give too much love to their timetables app. Yes, they are expected to hire someone competent to do it, but the "someone competent" also rarely cares. Still better than having no access to the timetables.
Unfortunately, every peabrained entrepreneur saw that and began eroding the moat until it was gone. The knowledge required to build things has been on a steady decline, and now, with AI, that decline has destroyed the moat completely. Every fucking hack with an "idea" is not only able to act on it, but acts like they're as good as the people who paid a heavy price to get to the same level through years of study and hard work.
Seriously? Is that why I ran all my desktop browsers with Flashblock even before the iPhone was out?
Dare to tell me Adobe was feverishly working in secret on reducing pointless CPU usage and saving my battery?
> "Ryan Lawler of TechCrunch wrote in 2012 "Jobs was right", adding Android users had poor experiences with watching Flash content and interactive Flash experiences were "often wonky or didn't perform well, even on high-powered phones".[9] Mike Isaac of Wired wrote in 2011 that "In [our] testing of multiple Flash-compatible devices, choppiness and browser crashes were common", and a former Adobe employee stated "Flash is a resource hog [...] It's a battery drain, and it's unreliable on mobile web browsers".[10] Kyle Wagner of Gizmodo wrote in 2011 that "Adobe was never really able to smooth over performance, battery, and security issues".[11]" - https://en.wikipedia.org/wiki/Thoughts_on_Flash
I was stoked to watch Apple nail the coffin shut, and see it consigned to history along with Java applets.
CapCut and Roblox would like a word. No, that's kinda just wrong. Content creation for non-technical folks has never been easier or more effective. Flash is just something nerds here remember fondly because it was a gateway drug into hackerdom. Some of us are older and might feel the same way about HyperCard or Turbo Pascal or whatnot.
But flash specifically deserved to die.
Maybe one day we'll see a JS/WASM framework that is just as portable.
They learned this much later after learning the game from Meta, Google, and Apple.
The first iPhone came with 128 MB of RAM and a 400 MHz CPU; it couldn't even run Safari smoothly. If you scrolled too fast, you'd get a checkerboard while you waited for the page to render. An iPhone with those specs didn't come out until 2011.
Adobe was always making promises it couldn't keep. The Motorola Xoom was supposed to be the "iPad killer" that could run Flash, but Adobe was late, leaving the Xoom in the unenviable position that, at launch, you couldn't visit the Xoom home page on the Xoom itself because it required Flash.
?