Getting Random Wikipedia Page




Hi.
I'm making a little program that picks a band name, the name of the band's first album, and a picture to use as the album cover.
Currently I can get a random picture from Flickr for the album cover, and I can get a random quote from a quotes site and chop it up to use as the album name.
Now I want to use the title of a random Wikipedia article as the band name. The problem is, whenever I try to 'file_get_contents($url)' on a Wikipedia page, I get this error:

C
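One workaround, sketched in Python rather than the asker's PHP: instead of fetching and scraping an HTML page, ask the MediaWiki API for a random article title directly (list=random in namespace 0). Note that Wikipedia commonly rejects requests that send no User-Agent header, which is a frequent cause of file_get_contents failures; set one on the real request.

```python
from urllib.parse import urlencode

API = "https://en.wikipedia.org/w/api.php"

def random_title_url():
    # list=random in namespace 0 (articles) returns one random page;
    # the title appears in the JSON response under query.random[0].title
    params = {"action": "query", "list": "random",
              "rnnamespace": 0, "rnlimit": 1, "format": "json"}
    return API + "?" + urlencode(params)
```

This only builds the request URL; the actual fetch (with an explicit User-Agent header) is left to whatever HTTP client the program already uses.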

Related to: Getting Random Wikipedia Page
Loading a Wikipedia page
Programming Languages

I want to make a button in a UIAlertView that opens a Wikipedia page for a subject stored in my array "array".


Here is how I'm doing it.


Wikipedia URLs follow the format http://en.wikipedia.org/wiki/<subject>. My array holds the subjects as text entries. I want the page to open in mobile Safari when the button is tapped. So far, no luck :(


Help please. Any insight would be appreciated.


-(void)alertView:(UIAlertView *)alertView clickedButtonAtIndex:(NSInteger)buttonIndex {
    if (buttonIndex == 1) {
        // Index into the array is illustrative; subjects with spaces would also need percent-escaping
        NSString *subject = [array objectAtIndex:0];
        NSString *urlString = [@"http://en.wikipedia.org/wiki/" stringByAppendingString:subject];
        [[UIApplication sharedApplication] openURL:[NSURL URLWithString:urlString]];
    }
}
How to get all URLs in a Wikipedia page

It seems like the Wikipedia API's definition of a "link" is different from a URL. I'm trying to use the API to return all the URLs on a specific wiki page.


I have been playing around with a query that I found on this page under generators and redirects.
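The distinction is likely between prop=links, which lists internal wiki links, and prop=extlinks, which lists external URLs. A small Python sketch that builds either request (parameter names are from the MediaWiki API; the page title is just an example):

```python
from urllib.parse import urlencode

API = "https://en.wikipedia.org/w/api.php"

def links_url(title, external=False):
    # prop=links -> internal wiki links; prop=extlinks -> external URLs
    prop = "extlinks" if external else "links"
    params = {"action": "query", "titles": title, "prop": prop, "format": "json"}
    return API + "?" + urlencode(params)
```

The JSON response nests results per page, so a client would walk query.pages to collect the entries.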


How to get Titles from a Wikipedia Page

Is there a direct API call where I can get the titles from a Wikipedia page?


For e.g. from http://en.wikipedia.org/wiki/Chicago, I want to retrieve the following:


1 History
 1.1 Rapid growth and development
 1.2 20th and 21st centuries
2 Geography
 2.1 Topography
 2.2 Climate
3 Cityscape
 3.1 Architecture
and so on


I have looked at http://www.mediawiki.org/wiki/API:Lists/All, but couldn't find an action that gives me the above list from a wiki page.
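That list is the page's table of contents, which the API exposes through action=parse with prop=sections rather than through the list modules. A Python sketch (the sample response below is trimmed down for illustration):

```python
from urllib.parse import urlencode

API = "https://en.wikipedia.org/w/api.php"

def sections_url(page):
    # action=parse with prop=sections returns the page's table of contents
    return API + "?" + urlencode({"action": "parse", "page": page,
                                  "prop": "sections", "format": "json"})

def toc_lines(response):
    # Each section entry carries a "number" (e.g. "1.1") and a heading text ("line")
    return ["%s %s" % (s["number"], s["line"]) for s in response["parse"]["sections"]]

# Offline illustration with a trimmed response:
sample = {"parse": {"sections": [{"number": "1", "line": "History"},
                                 {"number": "1.1", "line": "Rapid growth and development"}]}}
```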


Full urls of images of a given page on Wikipedia (only those I see on the page)

I want to extract the full URLs of all images on the "Google" page on Wikipedia.


I have tried with:


http://en.wikipedia.org/w/api.php?action=query&titles=Google&generator=images&gimlimit=10&prop=imageinfo&iiprop=url|dimensions|mime&format=json

but this way I also get images that are not related to Google, such as:


http://upload.wikimedia.org/wikipedia/en/a/a4/Flag_of_the_United_States.svg
http://upload.wikimedia.org/wikipedia/en/4/4a/Commons-logo.svg
http://upload.wikimedia.org/wikipedia/commons/f/fe/Cr
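The extra entries are interface and template images (Commons logo, infobox flags) that generator=images returns alongside article content. One pragmatic approach, sketched in Python: filter the returned URLs against a blocklist of known boilerplate filenames. The blocklist below is a heuristic assumption, not a complete list.

```python
# Heuristic blocklist of interface/template images that appear on many pages.
# These names are assumptions, not an exhaustive catalogue.
BOILERPLATE = ("Commons-logo", "Wiki_letter", "Flag_of_the_United_States")

def article_images(urls):
    """Keep only image URLs whose filenames don't match known boilerplate."""
    return [u for u in urls if not any(marker in u for marker in BOILERPLATE)]
```

Deduplicating the result is also worthwhile, since the generator can report the same file more than once.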
wikipedia page-to-page links by pageid

What?:
I'm trying to get a page-to-page link map (matrix) of Wikipedia pages by page_id, in the following format:


from1 to1 to2 to3 ...
from2 to1 to2 to3 ...
...

Why?:
I'm looking for a data set (Wikipedia pages) to try out PageRank.


Problem:
At dumps.wikimedia.org it is possible to download pages-articles.xml, which is an XML file with this kind of format:


<page>
  <title>...</title>
  <id>...</id>  <!-- pageid -->
  <text>...</text>
</page>

that I w
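One way to build that map from the dump, sketched in Python: extract each page's [[wikilink]] targets from the wikitext, then translate target titles into pageids. This assumes a small, namespace-free excerpt; the real dump is huge and uses an XML namespace, so a streaming parser (ElementTree.iterparse) would replace fromstring.

```python
import re
import xml.etree.ElementTree as ET

WIKILINK = re.compile(r"\[\[([^\]|#]+)")  # link target, before any | or # suffix

def link_map(dump_xml):
    """Map pageid -> list of pageids it links to, for pages present in the dump."""
    root = ET.fromstring(dump_xml)
    ids, raw = {}, {}
    for page in root.iter("page"):
        pid = int(page.findtext("id"))
        ids[page.findtext("title")] = pid
        raw[pid] = [m.group(1) for m in WIKILINK.finditer(page.findtext("text") or "")]
    # Keep only targets that exist in the dump, translated to their pageids
    return {pid: [ids[t] for t in targets if t in ids] for pid, targets in raw.items()}

sample = ("<pages>"
          "<page><title>A</title><id>1</id><text>see [[B]]</text></page>"
          "<page><title>B</title><id>2</id><text>[[A]] and [[C]]</text></page>"
          "</pages>")
```

Printing each entry as "from to1 to2 ..." then gives exactly the matrix format above.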

