—political activist, author, fugitive, and step-aunt of the famed, slain hip-hop artist Tupac Shakur—was born JoAnne Deborah Byron on July 16, 1947, in New York City, New York. Following her parents’ divorce in 1950, she moved with her mother and maternal grandparents to Wilmington, North Carolina. Shakur spent much of her adolescence alternating residences between her mother, who remarried and returned to New York, and relatives in Wilmington.

Shakur enrolled in Borough of Manhattan Community College before transferring to City College of New York, where her exposure to Black Nationalist organizations profoundly shaped her activism. Shakur attended meetings held by the Golden Drums, where she met her husband, Louis Chesimard. Members of the organization familiarized her with black historical figures who resisted racial oppression and social violence. She also began interacting with other activist groups and subsequently participated in student rights, anti-Vietnam war, and black liberation movements. In 1971, she adopted a new name: Assata (“she who struggles”) Olugbala (“love for the people”) Shakur (“the thankful”).

During a trip to Oakland, California in 1970, Shakur became acquainted with the Black Panther Party (BPP). She returned to New York City and joined the Harlem branch. Shakur worked in the BPP breakfast program but grew increasingly critical of the party because of its reluctance to collaborate with other black organizations.

Shakur left the BPP in 1971 and joined the Black Liberation Army (BLA), which the Federal Bureau of Investigation (FBI) branded an anarchist group. In 1972, the Bureau issued a warrant for her arrest in connection with crimes allegedly committed by the BLA.

On the evening of May 2, 1973, Shakur and two BLA companions were stopped by two state troopers for a traffic infraction on the New Jersey Turnpike, an encounter that ended in the deaths of Assata’s friend Zayd Shakur and State Trooper Werner Foerster. Arraigned on charges that included first-degree murder, Shakur went to trial seven times. She contended that the gunshot wound she sustained during the confrontation partially paralyzed her arm and left her incapable of firing a weapon, and forensic evidence supported her account; nevertheless, she was convicted of Trooper Foerster’s murder in 1977 and sentenced to life in prison plus 30 years.

In 1979, Shakur escaped from the maximum security unit of the New Jersey Clinton Correctional Facility for Women. She traveled to Cuba in 1984, where she was granted political asylum and reunited with her daughter, Kakuya Amala Olugbala, to whom she had given birth while imprisoned.

In 2013, on the 40th anniversary of Trooper Foerster’s death, the FBI placed Shakur on the Most Wanted Terrorists list, conferring upon her the dubious distinction of being the first woman and the second domestic terrorist to appear on the list. It also increased her bounty to two million dollars.

Shakur continues to live in exile in Cuba. Since her escape, Shakur’s life has been depicted in songs, documentaries and various literary works.

Megathreads and spaces to hang out:

reminders:

  • 💚 You nerds can join specific comms to see posts about all sorts of topics
  • 💙 Hexbear’s algorithm prioritizes comments over upbears
  • 💜 Sorting by new you nerd
  • 🌈 If you ever want to make your own megathread, you can reserve a spot here nerd
  • 🐶 Join the unofficial Hexbear-adjacent Mastodon instance toots.matapacos.dog

Links To Resources (Aid and Theory):

Aid:

Theory:

  • ashinadash [she/her] · edit-2 · 1 month ago

    Take that is almost twenty years late: the Cell Broadband Engine seems really fuckin stupid. SPEs have two separate pipelines but no branch prediction, their own tiny lil local isolated memory, linked by a ring bus. What are you supposed to do with it? I guess Guerrilla made their rendering deferred really early on so Killzone 2 could offload a bunch of dumb postprocessing with it? What a bad idea in 2006 though. Smh, Sony. Unfunny bit.

    "It's like SMP except dogshit! No you can't just toss threads onto the SPEs, fuck you! 256kb maximum executables within your code! Cry about it, fucker!" - Statements dreamed up by the utterly deranged

    • hexaflexagonbear [he/him] · 1 month ago

      It's insane how much more expensive a PS3 was to produce, and the end product is that if the developer studied the hardware thoroughly and designed specifically for it, you'd get performance almost as good as the Xbox 360, lol. Crazy part is Sony was considering releasing it without a GPU.

      • ashinadash [she/her] · 1 month ago

        They woulda been so fucking cooked without the RSX, but also the 7800 GTX was not an awesome choice, lol. The PS3 really goes to show why computers standardised on more general-purpose processors. Goofy ahh little synergistic processing units. Do they have REMOTE SYNERGY???

        • buckykat [none/use name] · 1 month ago

          The Core 2 Duo came out the same year, multi core architecture for end user devices was still in its infancy.

          • ashinadash [she/her] · 1 month ago

            brow and the Athlon64 X2 had been out for a year... While it's true that consumer multithreading was still in its infancy, there had been dual-processor workstations since at least the Pentium II. And they were gonna release this cluster of goofy, inflexible math churners orchestrated by a hamstrung PowerPC core without a GPU of any sort... With 256MB of XDR for the entire system? I cannot really see Sony's logic for this. They expected people to rebuild game engines for the PS3.

            • hexaflexagonbear [he/him] · edit-2 · 1 month ago

              Also, isn't the PowerPC in the Xbox 360 a perfectly fine multicore processor? Like literally the same supplier Sony used. But I think the Cell processor without a GPU kinda makes Sony's rationale a bit more clear. I think they didn't initially envision multicore CPUs as that useful for game logic, but potentially very useful for rendering and post-processing. I think they were initially going for a very early and poorly done APU. I need to go over which computational problems they were useful for; I do remember them accidentally making very good scientific computing machines lol.

              • ashinadash [she/her] · 1 month ago

                Yeah they're the same core, albeit the PS3's is stripped out and slower in some ways. Both systems would underperform a Pentium 4 for single threads, lmao.

                True they are good for specific parallel tasks, but.. as a gaming machine? Literally why? You are making a games console, Sony.

                • hexaflexagonbear [he/him] · 1 month ago

                  I guess it's partly because those parallel tasks in principle are good for physics and rendering engines. Also, probably Sony was overconfident in their CPU design prowess given the PS2's weird design and developers optimizing specifically for it let it often outperform a console that was like twice as powerful on paper.

                  • ashinadash [she/her] · 1 month ago

                    The PS2 did not often outperform either other platform :^) but generally yeah agreed.

            • buckykat [none/use name] · 1 month ago

              They were coming off the PS2, of course they expected people to build game engines specifically for their next thing.

    • PaX [comrade/them, they/them] · edit-2 · 1 month ago

      I feel like the PS3 and Cell are very misunderstood and have received unfair treatment trump-yassified after the fact, something that has been perpetuated by a lot of different freeze-gamer legends about why many games performed poorly on the PS3 compared to the 360 or on PCs.

      The Cell BE is really an extension and generalization of an idea that Sony applied to great success in the PS2: different highly specialized processors that game developers can use to speed up the most demanding tasks a game has to do like physics, geometry transformation, etc, attached to a general-purpose processor that can handle what's left over and manage all the hardware. Like.... people think it's ridiculous that the PS3 was originally not going to ship with a GPU (in the way that PC freeze-gamers think of a GPU) but the PS2 didn't have an off-the-shelf GPU, the graphics pipeline was completely designed by Sony in a similar way and it worked out very well for them.

      SPEs have two separate pipelines but no branch prediction

      That was the point, by not including branch prediction hardware you can make the processors run faster. The point of the SPEs was to compute regularly structured data in a massively parallel way. The idea is that the PPE handles all the complicated branching and decision-making and then hands off as much work as possible to the SPEs.
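
      The shape of it as a toy sketch (plain Python, nothing to do with real Cell APIs or libspe; the function names are made up, it just fakes the "control core handles the branchy stuff, worker cores crunch branch-free chunks" pattern):

```python
# Toy model of the PPE/SPE split. One "control" routine does the
# branchy decision-making, then hands contiguous chunks of data to
# simple branch-free "kernels". Pure illustration, not Cell code.

NUM_SPES = 6  # games on the PS3 got access to 6 SPEs

def spe_kernel(chunk):
    # Branch-free math over a chunk -- stand-in for physics/geometry work.
    return [x * 2.0 + 1.0 for x in chunk]

def ppe_dispatch(data):
    # "PPE" side: decide how to split the work, then farm the chunks out.
    size = len(data) // NUM_SPES
    chunks = [data[i * size:(i + 1) * size] for i in range(NUM_SPES)]
    out = []
    for c in chunks:  # on real hardware these would run simultaneously
        out.extend(spe_kernel(c))
    return out

result = ppe_dispatch(list(range(24)))
```

      On the actual hardware each worker loop would be a separate little program sitting in an SPE's 256 KB local store, all six running at once.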

      their own tiny lil local isolated memory, linked by a ring bus. What are you supposed to do with it?

      The SPEs' memory is more like a cache. A huge huge bottleneck for computer processors is memory latency. If instead you can guarantee memory accesses will complete within a very short time because the memory is right next to the processor on the die, you can make the processor run a lot faster because you don't have to stall while waiting for data or make complicated circuitry to figure out what instructions can run while waiting for a load or store, etc etc. And they aren't so isolated, you have a fast DMA engine that runs even while the processor is running. If you consider what the SPEs were meant for, as highly specialized processors that are constantly being fed new data and outputting processed data for the PPE to use, it's really not so bad. It's the same with how much program memory the SPEs have. It's VERY tight by modern standards but the point is that you only need to load a small program that does one specific thing.
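
      The double-buffering trick that DMA engine makes possible, again as a toy sketch (plain Python, made-up function, with list copies standing in for DMA transfers into two local-store buffers):

```python
# Toy sketch of SPE-style double buffering: while the "SPE" computes on
# one buffer, the next chunk is already being fetched into the other,
# so the processor never stalls waiting on main memory.

def stream_process(data, chunk_size=4):
    chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]
    buffers = [None, None]
    out = []
    buffers[0] = chunks[0]                 # initial "DMA" fill
    for i, _ in enumerate(chunks):
        cur = i % 2
        nxt = (i + 1) % 2
        if i + 1 < len(chunks):
            buffers[nxt] = chunks[i + 1]   # next transfer kicked off "in background"
        out.extend(x * x for x in buffers[cur])  # compute overlaps the transfer
    return out

squares = stream_process(list(range(8)))
```

      While one buffer is being squared, the next chunk is already on its way into the other, which is the whole point of hiding the latency.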

      Yeah, it's a very different and sometimes harder way of programming or thinking about a computer system, but the bottlenecks Cell was meant to address exist in all modern computer systems. You can only make a single processor run so fast, which is why multi-processor systems, and programs written for them (especially modern video games), are now everywhere. People who complained back then about having to write games in a completely new way are now writing multi-processor video game engines lol

      Didn't think I would be a Sony defender today lol

      • ashinadash [she/her] · 1 month ago

        Getting nerded by PaX aubrey-happy

        Uh so the PPE was intended to handle things like Unreal 3 physics and collision then? The PPE itself is not a very cool or fast CPU and the lil IBM data slides I saw did not really indicate offloading stuff to it, Idk. I can see how the SPEs are sort of like a GPU almost? But it's just a really weird way to go about things, near as I can see. Granted there is nothing above my brainstem.

        The thing to me I guess is, "highly specialised, inflexible" and "massively parallel" do not seem like a good fit for games, that's my main takeaway. Plus, vidya would continue to feature mostly atrocious threading (with rare exceptions in Battlefield games, Lost Planet, a few others) until the mid 2010s... You can't really just throw a physics thread onto an SPE the way devs did on 360, right? It'd have to be specialised for the SPEs in particular.

        I guess the thrust is I can see the design logic now (and buckykat was sorta right lol) but it still seems like a bad move at the time, and that it would have been cheaper and easier to just stick four of the Xenon cores into the PS3 or something.

        • PaX [comrade/them, they/them] · edit-2 · 1 month ago

          I can see how the SPEs are sort of like a GPU almost?

          The thing to me I guess is, "highly specialised, inflexible" and "massively parallel" do not seem like a good fit for games, that's my main takeaway.

          The SPEs are kinda like a GPU. The thing is GPUs are even more specialized and parallel than Cell (which is probably why they had to bolt one on in the end lol). Stuff like graphics or physics math, which are the most computationally demanding parts of a video game just cuz of how much data there is to work on, is really really parallelizable, and a lot of the advances in video game graphics can be attributed to having more parallelism (like in a consumer GPU: many many more tiny processors for computing parts of shaders). Apparently the early plans for the Cell BE had 4 PPEs and 32 SPEs which would have been really cool lol, but they scaled down.
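
          To make the "really really parallelizable" bit concrete, a toy per-pixel op (plain Python, invented example): every output element depends only on its own input, so the work splits across any number of tiny processors with zero coordination:

```python
# Why graphics math parallelizes so well: each output pixel is computed
# from its own inputs alone, with no dependency on any other pixel.
# Toy per-"pixel" brightness op on (r, g, b) tuples.

def brighten(pixel, gain=2):
    r, g, b = pixel
    # Scale each channel and clamp to the 8-bit range.
    return (min(r * gain, 255), min(g * gain, 255), min(b * gain, 255))

pixels = [(10, 20, 30), (100, 150, 200), (128, 128, 128)]
# Every element is independent, so a GPU (or a pile of SPEs) could run
# each one on a different tiny processor at the same time.
bright = [brighten(p) for p in pixels]
```

          That independence is the whole game: no element ever has to wait on another, so throughput scales with however many processors you can throw at it.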

          You can't really just throw a physics thread onto an SPE the way devs did on 360, right? It'd have to be specialised for the SPEs in particular.

          Yeahh, that's true. You do kinda have to upend how you were doing everything before and design specifically for an architecture like this.

          that it would have been cheaper and easier to just stick four of the Xenon cores into the PS3 or something.

          It definitely would have been but I respect the ambition tbh. The high performance computing nerds loved these things (remember when people would build PS3 clusters? Cool as hell)

          • ashinadash [she/her] · 1 month ago

            It's like a GPGPU except it sucks =) and yeah, but GPUs are not also expected to be a CPU. Was Cell tryna do too much at once? 4 PPE/32 SPE would have been ABSURD!

            You do kinda have to upend how you were doing everything before and design specifically

            See I wasn't totally out to lunch!! I dunno if it was reasonable to expect people to just upend their codebases when the groundwork for SMP support in engines was being laid on PC since like 2004, and the PS3 was the only system going that used this darn thing. Especially when the 360 had an extremely strong early showing as a developer friendly (relatively) system.

            I'm really glad I asked that doofy question about the PS3's memory pools now though, deep diving on PS3 architecture has been cool even if I'm not that computer smart. I understand so much more now! Also I guess the homebrew scene must have got a boost from early OtherOS-based development/home distributed computing stuff laying the groundwork. I wonder if emulators offload stuff to SPEs...?

            Also I had forgotten just how bad Unreal 3 was early on! That shit killed studios, you remember?

            • PaX [comrade/them, they/them] · 1 month ago

              It's like a GPGPU except it sucks =)

              Uhhhh yea, idk I think it's cool lmao

              And they were doing all this stuff before GPU people figured out a coherent way to expose their vector processors to the world (GPGPU via OpenCL or CUDA)

              They maybe were too ambitious lol

              I'm really glad I asked that doofy question about the PS3's memory pools now though, deep diving on PS3 architecture has been cool even if I'm not that computer smart. I understand so much more now!

              Yesss, I'm glad. I didn't know much about Cell before I started reading either, and I still don't know that much tbh blob-no-thoughts

              Also I guess the homebrew scene must have got a boost from early OtherOS-based development/home distributed computing stuff laying the groundwork. I wonder if emulators offload stuff to SPEs...?

              I'm not sure tbh. It seems OtherOS things get access to 6 of the 7 SPEs so it definitely is possible

              Also I had forgotten just how bad Unreal 3 was early on! That shit killed studios, you remember?

              Ooh I actually don't remember tbh. What was so bad about it?

              • ashinadash [she/her] · 1 month ago

                I mean it's kind of cool but it's not the right thingy for 2006. I guess you gotta shoot your shot as far as leading a computing parallelisation revolution goes... but again, it took devs AGES to split their games out for multi core properly, let alone this specialisation stuff.

                The OpenCL and Distributed stuff made me snort, were they trying to sell this to g*mers or nerds?? Cell is funny but uh lol

                Lemme clip a bit from an IGN article on the history of Unreal

                Having already made great headway in establishing engine-licensing as a major part of their business, Epic was even more ambitious with the release of Unreal Engine 3. While UE2 supported home consoles, it was released several years into their various life cycles, meaning the engine had to adapt to the out-dated hardware. With the third official iteration of the engine Epic had a chance to tailor the experience for the coming generation of HD consoles and harness their power from the outset. In 2005 Epic announced Gears of War as the graphical showpiece for what the Xbox 360 could do using their technology. Epic's Vice President Mark Rein later boasted that he and Sweeney were able to convince Microsoft to double the onboard RAM in the system (from 256MB to 512MB) after seeing the equivalent difference in graphics in a screenshot from Gears.

                Sony was given a timed console exclusive with the PlayStation 3 version of Unreal Tournament 3 to fire a similar sense of graphical wonderment in its partisans. In the build-up to the launch window of each system, there was a steady string of announcements from Epic about new licensees for its nascent tech. Square-Enix, EA, Ubisoft, Disney, THQ, SEGA, Activision, Midway and more signed on for Unreal 3 licenses. Many of the most-anticipated titles in the early days of the 360 were powered by Unreal 3, including Gears of War, John Woo's Stranglehold, Medal of Honor: Airborne, Blacksite: Area 51, and Brothers in Arms: Hell's Highway. It seemed that Unreal Engine 3 might come to define development on high definition consoles.

                But then rumors of development teams struggling with the new tools began to roll in. The big increase in technological capacity made stunning visuals possible, but it also made development significantly more expensive and complicated than it had been. As Epic continued to add updates to its engine in the run-up to launch, developers with milestones to complete often struggled to keep pace. In a post-mortem published in Game Developer magazine, team members from John Woo's Stranglehold lamented their time lost trying to make new versions of the engine work with content made using earlier versions. Then there were reports of the difficulty optimizing the PS3 versions of multi-platform games. After lack-luster receptions for The Last Remnant, Square-Enix president Yoichi Wada said the company's future use of the engine would be made on a "case-by-case basis."

                The rumors all came to a head in mid-2007 when Silicon Knights, which had proudly announced their licensing of UE3 two years earlier, filed a lawsuit against Epic claiming UE3 never did what they had been contractually promised. They declared they would instead have to build their own engine for Too Human, causing major delays and millions of dollars in lost development resources.

                Basically "Gosh HD is hard" combined with buggy tools and poor support for devs, especially Japanese ones. A lot of early UE3 games had big engine problems, Mass Effect, Lost Odyssey. Not a cool engine and it didn't even look that good :^) nice fuckin texture streaming fail, Cliff :^)

                • PaX [comrade/them, they/them] · edit-2 · 1 month ago

                  The OpenCL and Distributed stuff made me snort, were they trying to sell this to g*mers or nerds?? Cell is funny but uh lol

                  Actually both, Sony, IBM, and Toshiba developed the Cell together for multimedia stuff and for high performance computing

                  IBM was continuing to develop the architecture for many years after (idk if they're still doing it)

                  I guess their expectations for programmers were too high :( which makes sense considering game companies have no interest in new ways of computing, just keeping the slop going out and the money coming in

                  Basically "Gosh HD is hard" combined with buggy tools and poor support for devs, especially Japanese ones. A lot of early UE3 games had big engine problems, Mass Effect, Lost Odyssey. Not a cool engine and it didn't even look that good :^) nice fuckin texture streaming fail, Cliff :^)

                  Ohh I see lmao

                  I wasn't around for all that drama but it seems like a mess

                  • ashinadash [she/her] · 1 month ago

                    That's so dorky, there wasn't a huuuuge amount of crossover there. Not a focused enough gamer machine imo.

                    Were they? In my brain the Cell BE is the last gasping thrash of anything that isn't i386 or ARM, huh.

                    yea g*merbrain extends to programmers too, not many programming socks 'round back then

                    All I remember is UE3 games were ugly and buggy =) Epic should have sank on the back of that imo, but when you make the only freely licensed middleware engine for the hardware everyone else finds really difficult... (save me, RenderWare... RenderWare save me)