Assata Shakur—political activist, author, fugitive, and step-aunt of the famed, slain hip-hop artist Tupac Shakur—was born JoAnne Deborah Byron on July 16, 1947, in New York City, New York. Following her parents’ divorce in 1950, she moved with her mother and maternal grandparents to Wilmington, North Carolina. Shakur spent much of her adolescence alternating residences between her mother, who remarried and returned to New York, and relatives in Wilmington.

Shakur enrolled in Borough of Manhattan Community College before transferring to the City College of New York, where her exposure to Black Nationalist organizations profoundly influenced her activism. She attended meetings held by the Golden Drums, where she met her husband, Louis Chesimard. Members of the organization familiarized her with black historical figures who resisted racial oppression and social violence. She also began interacting with other activist groups and subsequently participated in the student rights, anti-Vietnam War, and black liberation movements. In 1971, she adopted a new name: Assata (“she who struggles”) Olugbala (“love for the people”) Shakur (“the thankful”).

During a trip to Oakland, California in 1970, Shakur became acquainted with the Black Panther Party (BPP). She returned to New York City and joined the Harlem branch. Shakur worked in the BPP’s breakfast program but grew increasingly critical of the party’s reluctance to collaborate with other black organizations.

Shakur left the BPP in 1971 and joined the Black Liberation Army (BLA), which the Federal Bureau of Investigation (FBI) branded an anarchist group. In 1972, the Bureau issued a warrant for her arrest in connection with crimes allegedly committed by the BLA.

On the evening of May 2, 1973, Shakur and two BLA companions were stopped by two state troopers for a traffic infraction on the New Jersey Turnpike, an encounter that ended in the deaths of Shakur’s friend Zayd Shakur and State Trooper Werner Foerster. Arraigned on charges that included first-degree murder, Shakur went to trial seven times. She contended that the gunshot wound she sustained during the confrontation partially paralyzed her arm and left her incapable of firing a weapon, and forensic evidence supported that account; nevertheless, she was convicted of Trooper Foerster’s murder in 1977 and sentenced to life in prison plus 30 years.

In 1979, Shakur escaped from the maximum-security unit of the Clinton Correctional Facility for Women in New Jersey. She traveled to Cuba in 1984, where she was granted political asylum and reunited with her daughter, Kakuya Amala Olugbala, whom she had delivered while imprisoned.

In 2013, on the 40th anniversary of Trooper Foerster’s death, the FBI placed Shakur on its Most Wanted Terrorists list, conferring on her the dubious distinction of being the first woman and the second domestic terrorist to appear on the list. It also increased the reward for her capture to two million dollars.

Shakur continues to live in exile in Cuba. Since her escape, her life has been depicted in songs, documentaries, and various literary works.

reminders:

  • 💚 You nerds can join specific comms to see posts about all sorts of topics
  • 💙 Hexbear’s algorithm prioritizes comments over upbears
  • 💜 Sorting by new you nerd
  • 🌈 If you ever want to make your own megathread, you can reserve a spot here nerd
  • 🐶 Join the unofficial Hexbear-adjacent Mastodon instance toots.matapacos.dog


  • ashinadash [she/her]
    ·
    2 months ago

    It's like a GPGPU except it sucks =) and yeah, but GPUs are not also expected to be a CPU. Was Cell tryna do too much at once? 4 PPE/32 SPE would have been ABSURD!

    You do kinda have to upend how you were doing everything before and design specifically

    See I wasn't totally out to lunch!! I dunno if it was reasonable to expect people to just upend their codebases when the groundwork for SMP support in engines was being laid on PC since like 2004, and the PS3 was the only system going that used this darn thing. Especially when the 360 had an extremely strong early showing as a developer friendly (relatively) system.

    I'm really glad I asked that doofy question about the PS3's memory pools now though, deep diving on PS3 architecture has been cool even if I'm not that computer smart. I understand so much more now! Also I guess the homebrew scene must have got a boost from early OtherOS-based development/home distributed computing stuff laying the groundwork. I wonder if emulators offload stuff to SPEs...?

    Also I had forgotten just how bad Unreal 3 was early on! That shit killed studios, you remember?

    • PaX [comrade/them, they/them]
      ·
      2 months ago

      It's like a GPGPU except it sucks =)

      Uhhhh yea, idk I think it's cool lmao

      And they were doing all this stuff before GPU people figured out a coherent way to expose their vector processors to the world (GPGPU via OpenCL or CUDA)

      They maybe were too ambitious lol

      I'm really glad I asked that doofy question about the PS3's memory pools now though, deep diving on PS3 architecture has been cool even if I'm not that computer smart. I understand so much more now!

      Yesss, I'm glad. I didn't know much about Cell before I started reading either, and I still don't know that much tbh blob-no-thoughts

      Also I guess the homebrew scene must have got a boost from early OtherOS-based development/home distributed computing stuff laying the groundwork. I wonder if emulators offload stuff to SPEs...?

      I'm not sure tbh. It seems OtherOS things get access to 6 of the 7 SPEs so it definitely is possible

      Also I had forgotten just how bad Unreal 3 was early on! That shit killed studios, you remember?

      Ooh I actually don't remember tbh. What was so bad about it?

      • ashinadash [she/her]
        ·
        2 months ago

        I mean it's kind of cool but it's not the right thingy for 2006. I guess you gotta shoot your shot as far as leading a computing parallelisation revolution goes... but again, it took devs AGES to split their games out for multi core properly, let alone this specialisation stuff.

        The OpenCL and Distributed stuff made me snort, were they trying to sell this to g*mers or nerds?? Cell is funny but uh lol

        Lemme clip a bit from an IGN article on the history of Unreal

        Having already made great headway in establishing engine-licensing as a major part of their business, Epic was even more ambitious with the release of Unreal Engine 3. While UE2 supported home consoles, it was released several years into their various life cycles, meaning the engine had to adapt to the out-dated hardware. With the third official iteration of the engine Epic had a chance to tailor the experience for the coming generation of HD consoles and harness their power from the outset. In 2005 Epic announced Gears of War as the graphical showpiece for what the Xbox 360 could do using their technology. Epic's Vice President Mark Rein later boasted that he and Sweeney were able to convince Microsoft to double the onboard RAM in the system (from 256MB to 512MB) after seeing the equivalent difference in graphics in a screenshot from Gears.

        Sony was given a timed console exclusive with the PlayStation 3 version of Unreal Tournament 3 to fire a similar sense of graphical wonderment in its partisans. In the build-up to the launch window of each system, there was a steady string of announcements from Epic about new licensees for its nascent tech. Square-Enix, EA, Ubisoft, Disney, THQ, SEGA, Activision, Midway and more signed on for Unreal 3 licenses. Many of the most-anticipated titles in the early days of the 360 were powered by Unreal 3, including Gears of War, John Woo's Stranglehold, Medal of Honor: Airborne, Blacksite: Area 51, and Brothers in Arms: Hell's Highway. It seemed that Unreal Engine 3 might come to define development on high definition consoles.

        But then rumors of development teams struggling with the new tools began to roll in. The big increase in technological capacity made stunning visuals possible, but it also made development significantly more expensive and complicated than it had been. As Epic continued to add updates to its engine in the run-up to launch, developers with milestones to complete often struggled to keep pace. In a post-mortem published in Game Developer magazine, team members from John Woo's Stranglehold lamented their time lost trying to make new versions of the engine work with content made using earlier versions. Then there were reports of the difficulty optimizing the PS3 versions of multi-platform games. After lack-luster receptions for The Last Remnant, Square-Enix president Yoichi Wada said the company's future use of the engine would be made on a "case-by-case basis."

        The rumors all came to a head in mid-2007 when Silicon Knights, which had proudly announced their licensing of UE3 two years earlier, filed a lawsuit against Epic claiming UE3 never did what they had been contractually promised. They declared they would instead have to build their own engine for Too Human, causing major delays and millions of dollars in lost development resources.

        Basically "Gosh HD is hard" combined with buggy tools and poor support for devs, especially Japanese ones. A lot of early UE3 games had big engine problems, Mass Effect, Lost Odyssey. Not a cool engine and it didn't even look that good :^) nice fuckin texture streaming fail, Cliff :^)

        • PaX [comrade/them, they/them]
          ·
          edit-2
          2 months ago

          The OpenCL and Distributed stuff made me snort, were they trying to sell this to g*mers or nerds?? Cell is funny but uh lol

          Actually both, Sony, IBM, and Toshiba developed the Cell together for multimedia stuff and for high performance computing

          IBM was continuing to develop the architecture for many years after (idk if they're still doing it)

          I guess their expectations for programmers were too high :( which makes sense considering game companies have no interest in new ways of computing, just keeping the slop going out and the money coming in

          Basically "Gosh HD is hard" combined with buggy tools and poor support for devs, especially Japanese ones. A lot of early UE3 games had big engine problems, Mass Effect, Lost Odyssey. Not a cool engine and it didn't even look that good :^) nice fuckin texture streaming fail, Cliff :^)

          Ohh I see lmao

          I wasn't around for all that drama but it seems like a mess

          • ashinadash [she/her]
            ·
            2 months ago

            That's so dorky, there wasn't a huuuuge amount of crossover there. Not a focused enough gamer machine imo.

            Were they? In my brain the Cell BE is the last gasping thrash of anything that isn't i386 or ARM, huh.

            yea g*merbrain extends to programmers too, not many programming socks 'round back then

            All I remember is UE3 games were ugly and buggy =) Epic should have sank on the back of that imo, but when you make the only freely licensed middleware engine for the hardware everyone else finds really difficult... (save me, RenderWare... RenderWare save me)