perl.libwww
  Re: installation of frameready-1.020 failed         


Author: Yun-an Yan
Date: Jan 29, 2010 00:45

Mark,

Thank you so much for your reply. I will do that.

Best wishes,
Yun-an
>Yun-an,
>
>The test that failed was a live test, meaning it ran against a live website
>and could have failed for a reason caused by the server rather than
>your software. It is probably safe to ignore. You can just force a "make
>install" to skip it.
>
> Mark
no comments
  Re: Need help for the error "301 moved permanently"!         


Author: Mark Stosberg
Date: Jan 28, 2010 19:33

On Thu, 28 Jan 2010 15:51:47 +0530
bipin Nayak <...@gmail.com> wrote:
> Thanks for adding me to this group.
>
> Following is the script and result I am getting:-

Bipin,

I tried the script as you gave it and it gave the source of the login
page as the result, *not* a 301. Are you using the latest versions of
LWP::UserAgent and WWW::Mechanize? I just downloaded the latest
versions from CPAN.

Mark
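
A raw 301 on a POST is often just LWP declining to follow the redirect; a minimal sketch of making POST redirectable (the settings shown are illustrative):

```perl
use strict;
use warnings;
use LWP::UserAgent;

my $ua = LWP::UserAgent->new;

# GET and HEAD redirects are followed by default; POST is not, which
# is a common reason a script sees a "301 Moved Permanently" response.
push @{ $ua->requests_redirectable }, 'POST';

# max_redirect caps how many hops will be followed (default is 7).
$ua->max_redirect(5);

print join(' ', @{ $ua->requests_redirectable }), "\n";
```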

no comments
  Re: installation of frameready-1.020 failed         


Author: Mark Stosberg
Date: Jan 28, 2010 19:22

On Wed, 27 Jan 2010 12:47:39 +0100
Yun-an Yan wrote:
> Dear All,
>
> I cannot pass the test when I try to install Frameready-1.020.
> Would somebody please help me?

Yun-an,

The test that failed was a live test, meaning it ran against a live website
and could have failed for a reason caused by the server rather than
your software. It is probably safe to ignore. You can just force a "make
install" to skip it.

Mark
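
For a standard ExtUtils::MakeMaker distribution, that advice amounts to the following sketch (the install target does not depend on the test target, so a failing live test need not block it):

```shell
perl Makefile.PL
make
make test      # the live test may fail for server-side reasons
make install   # proceed anyway; "install" does not depend on "test"
```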

no comments
  Silently removed parameters when posting UTF8


Author: Bill Moseley
Date: Sep 18, 2008 22:54

I have a form that posts to a service, and noticed not all
parameters were being posted.

I realized my mistake of not encoding to utf8, but I'm curious why
this did not generate any warnings.

I can imagine others getting caught by this, so a warning would be very
helpful.

Not really sure where it should be checked -- in query_form when
processing the individual parameters, I suspect, but the damage seems
to happen when $uri->query is called:

$q =~ s/([^$URI::uric])/$URI::Escape::escapes{$1}/go;

Would it not be better to issue a warning?

use HTTP::Request::Common;
use strict;
use warnings;
use Data::Dumper;
use Encode;
Show full article (1.34Kb)
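
The drop can be reproduced without a web server; a minimal sketch, assuming the CPAN URI module and core Encode (the parameter name and value are illustrative):

```perl
use strict;
use warnings;
use URI;
use Encode qw(encode);

# A decoded Perl string holding a non-ASCII character.
my $name = "caf\x{e9}";

my $uri = URI->new('http://example.com/');

# Encoding to UTF-8 bytes before query_form avoids the wide-character
# problem described above; the query then percent-encodes cleanly.
$uri->query_form( q => encode('UTF-8', $name) );
print $uri->query, "\n";   # q=caf%C3%A9
```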
5 Comments
  Using LWP module in perl with Crypt::SSLeay         


Author: Sanket Kathalkar
Date: Sep 18, 2008 05:22

Hi,

I am going to use the LWP module for accessing URLs which are SSL
enabled. Is Crypt::SSLeay required for this? Could you give me the
steps to implement this? Do I need to create a certificate to access
a remote server?

Thanks and Regards,

Sanket Kathalkar
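
In short, yes: with Crypt::SSLeay (or IO::Socket::SSL) installed, LWP gains https support with no code changes, and a client certificate is only needed if the server demands one. A minimal sketch (the URL is a placeholder):

```perl
use strict;
use warnings;
use LWP::UserAgent;

# Once Crypt::SSLeay or IO::Socket::SSL is installed, https URLs go
# through exactly the same interface as plain http ones.
my $ua  = LWP::UserAgent->new;
my $res = $ua->get('https://www.example.com/');
print $res->status_line, "\n";
```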
1 Comment
  LWP content encode         


Author: Stefano Tacconi
Date: Sep 15, 2008 08:39

Hi List,

I'm writing a simple script to download some web pages from the net.
Using LWP it works fine, but how can I get an HTML page containing
strange characters?

For example, LWP doesn't get a page with the string "on-demand†"
(on-demand%C3%A2%C2%80%C2%9D).

I tried in vain with Encode, HTTP::Response::Encoding.

if ($res->is_success) {
    my $html_content = $res->content;
    #my $html_content = $res->decoded_content;
    #my $html_content = encode( 'utf8', $res->decoded_content );
    my $html_content = uri_escape_utf8($res->content);
    ...
    ...

LWP returns half pages, until it finds a strange character.

Any suggestions?

S.T.

p.s. I read the "jerakeen.org/files/2005/perl-utf8.slides.pdf" documentation
but I still have some problems... :)
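
A response built by hand shows what decoded_content does with bytes like those quoted above; a sketch assuming the HTTP::Response class that ships with libwww-perl (the body is "on-demand" plus U+201D encoded as UTF-8):

```perl
use strict;
use warnings;
use HTTP::Response;

my $res = HTTP::Response->new(200, 'OK');
$res->header('Content-Type' => 'text/html; charset=UTF-8');

# Raw body bytes: "on-demand" followed by a UTF-8 encoded U+201D.
$res->content("on-demand\xE2\x80\x9D");

# content() returns the 12 raw bytes; decoded_content() applies the
# charset from Content-Type and returns 10 Perl characters.
printf "%d bytes, %d chars\n",
    length $res->content, length $res->decoded_content;
```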
1 Comment
  restrict download bandwidth         


Author: Jan Buchholz
Date: Sep 12, 2008 03:48

Hello,

does anybody know how I could restrict the download bandwidth with
LWP::UserAgent?

THX

--
Jan Buchholz
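
LWP::UserAgent has no built-in rate limit, but the ':content_cb' handler receives each chunk as it arrives, so sleeping between chunks caps the average rate. A sketch (the URL, filename, and 50 KB/s figure are illustrative):

```perl
use strict;
use warnings;
use LWP::UserAgent;
use Time::HiRes qw(sleep time);

my $limit_bps = 50 * 1024;   # target: roughly 50 KB/s
my $received  = 0;
my $start     = time;

open my $out, '>', 'download.bin' or die "open: $!";
binmode $out;

my $ua = LWP::UserAgent->new;
$ua->get(
    'http://www.example.com/big-file',
    ':read_size_hint' => 8192,
    ':content_cb'     => sub {
        my ($chunk) = @_;
        print {$out} $chunk;
        $received += length $chunk;

        # If we are ahead of schedule, sleep off the difference.
        my $ahead = $received / $limit_bps - (time - $start);
        sleep $ahead if $ahead > 0;
    },
);
close $out;
```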
1 Comment
  [PATCH] FIFO header order support in HTTP::Headers         


Author: Michael Greb
Date: Sep 4, 2008 12:35

Greetings,

We are currently using HTTP::Daemon to prototype a project and have a
need to access headers in the order they were sent over the network.
Our particular use case is cryptographically signing a subset of the
headers and sending this signature as an additional header.

A specified set of headers are to be included in the signature if
present in the request. We join the content of these headers (with
"\n") then calculate the expected signature and compare it to the
value submitted by the client. In order to get the same signature, we
must join the header content in the same order as the client. If we
only needed to support perl clients using LWP::UserAgent, this
wouldn't be an issue as HTTP::Daemon and LWP::UserAgent both use
HTTP::Headers and the order the headers will be presented to the
consuming script is predictable. Unfortunately, we must support
multiple languages.
Show full article (3.20Kb)
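
The joining-and-signing step itself is independent of HTTP::Daemon; a sketch using the core Digest::SHA module (the header names, values, and choice of SHA-256 are illustrative):

```perl
use strict;
use warnings;
use Digest::SHA qw(sha256_hex);

# The signed headers, in the order the client sent them.
my @signed = (
    'x-timestamp: 1220500000',
    'x-client-id: abc123',
);

# Join the header content with "\n", as described, then digest it;
# both sides must use the same order for the signatures to match.
my $signature = sha256_hex(join "\n", @signed);
print "$signature\n";
```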
7 Comments
  Re: HTTP/Cookies.pm bug?         


Author: Gisle Aas
Date: Sep 4, 2008 01:21

On Sep 4, 2008, at 07:06, Russ Schnapp wrote:
> I'm not sure whether you're still interested in this, but I think
> I've come across a bug in HTTP::Cookies. If you're not the right
> person to handle this, please let me know who is.

I'm the right one; but it's usually best to send requests like this
to the libwww mailing list. Cc:-ed.
> The problem is in add_cookie_header. If the cookie version is
> nonzero and the cookie contents include a non-alpha (\W) character,
> it escapes any quotes or slashes in the cookie value.

Why do you specify a nonzero version number without using the
Set-Cookie2 header?

I'm thinking that the right fix for this might be to just force
'version=0' for any cookie set with 'Set-Cookie'. This patch achieves
that:
Show full article (2.27Kb)
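
The distinction drawn above can be seen via the jar's own API; a sketch assuming HTTP::Cookies from libwww-perl (the cookie name, value, and domain are illustrative):

```perl
use strict;
use warnings;
use HTTP::Cookies;

my $jar = HTTP::Cookies->new;

# Version 0 marks a plain Netscape-style cookie, the kind set with a
# Set-Cookie header; nonzero versions belong to RFC 2965 Set-Cookie2.
$jar->set_cookie(0, 'session', 'abc123', '/', 'example.com', undef,
                 1, 0, 3600, 0);

print $jar->as_string;
```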
no comments
  Effect of Namespaces in XHTML on Headers         


Author: Phil Archer
Date: Sep 1, 2008 02:15

Hi,

I've used LWP in several apps in which the key bit of information I'm
after is the headers. I've therefore got used to the fact that if the
returned resource is HTML, one of the triggers for "OK, that's all the
headers and everything else must be content" is the presence of anything
in the <head> section of the document that LWP doesn't recognise.

Take this, for example:

<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN"
    "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns:creativeCommons='http://backend.userland.com/creativeCommonsRssModule'
    xmlns="http://www.w3.org/1999/xhtml" dir="ltr" lang="en-US">

http://creativecommons.org/licenses/by-nc-nd/3.0/

http://gmpg.org/xfn/11">
...

Perfectly valid XHTML - but... LWP doesn't recognise the

The User Agent package I'm using is version 2.31

So, some questions:
Show full article (2.02Kb)
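
The <head> scan described here is done by the HTML::HeadParser module, which LWP uses when its parse_head option is enabled; a small sketch (the markup is illustrative):

```perl
use strict;
use warnings;
use HTML::HeadParser;

my $p = HTML::HeadParser->new;
$p->parse(<<'HTML');
<html xmlns="http://www.w3.org/1999/xhtml"><head>
<title>Example</title>
<link rel="profile" href="http://gmpg.org/xfn/11" />
</head><body>...</body></html>
HTML

# Whatever the parser collected from <head> is exposed as headers.
print $p->header('Title'), "\n";   # Example
```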
1 Comment