I want to see what you guys' opinions are without bringing slavery into this.
This will be incredibly difficult to source due to the broad nature of the question (which, curiously, sounds like a high school test question), but I'll give you my opinion. I've researched and written on Native American history for years, so you can take it or leave it.
The federal government didn't want to deal with either group. The "wild" natives of the West (the ones that hadn't yet been "tamed" by white civilization) were technically at war with the U.S.; you can read more on this by researching what the U.S. calls the "Indian Wars". Other natives, who had previously been deemed "civilized", had already been removed from their ancestral lands in the southeastern U.S. and relocated to Indian Territory, and other native bands in the Northeast had been driven to outright extinction.

To answer your question, though, I'll stick to post-1877 dealings between Native American groups and the U.S. By the time of the massacre of unarmed natives at Wounded Knee, South Dakota, in 1890, the U.S. had basically achieved its primary goal: eliminating any real threat from western Native American tribes in order to clear the way for transcontinental commerce and U.S. expansion and migration. Their collective ways of life had been destroyed; nomadic hunters and agricultural tribes alike were forced to rely on government rations of spoiled food and white man's clothing. In short (running out of time here), the government probably wanted to exterminate the western natives, but settled for keeping them on reservations in a state of permanent dependency on federal assistance. Honestly, not much has changed in that respect since 1900.
African-Americans, from 1877 to 1890 (and for a few years afterward), experienced mob killings in numbers never before, nor since, seen in the U.S. The federal government probably wanted African-Americans to go away, just as it did the natives, but it was not willing to give them the "dignity" of reservation life and permanent forced welfare. During Reconstruction, under the control of Congressional Republicans, African-Americans in the South saw some measure of federally enforced success: serving as senators and representatives, civil rights legislation, some actual social status (nope, Jackie Robinson was not the first African-American baseball player to play in the white leagues), and so on. After Reconstruction, however, the rights and gains that African-Americans had made were gradually stripped away. Congress stopped passing acts aimed at assisting former slaves, the judicial branch even upheld social barriers like "separate but equal", and African-Americans lived in daily fear. So much so that, by the 1940s, a significant percentage of African-Americans had migrated to northern cities in search of jobs and social equality.
To finish, my opinion is that African-Americans probably "got off better", because they were allowed (thank you, whitey!) to live within U.S. society, albeit a segregated and ultra-violent one toward them, and they could, in theory at least, see some sort of upward mobility within the U.S. Native Americans were not so lucky. Many non-reservation natives tried to make it in white America, but none were nearly as successful as African-Americans. I mean, Jim Thorpe wasn't even full-blooded native, and he only became famous because he played college football for a white-run boarding school for Native Americans (they were badass, by the way). Natives, even in recent times, have received almost zero redress for their treatment in the 1800s, while African-Americans have grown to enjoy a cultural explosion that began in the 1920s.
There is plenty of room for argument here, especially as it relates to education (natives were force-fed white educational methods and schools, while African-American education was ignored rather than encouraged) and religion (natives were force-fed white religion from the moment Columbus set his dirty-ass foot in the Western Hemisphere, while whites, for the most part, have avoided incorporating African-Americans into traditionally white churches), but this is just my two cents. Hope this helps.
When you say "freedmen", you specifically mean the post-war former slaves, right?