Created at 1pm, Apr 6
ProactiveMedia
Website Quality and Accessibility
YoGTL1fPh4bO1mZbYJAj2Ui26-8qmc0iuGraGKjP2mY
File Type: DOCX
Entry Count: 68
Embed. Model: jina_embeddings_v2_base_en
Index Type: hnsw

One of the Web’s shortcomings as a publishing medium is its lack of a formal editorial process. Anyone can publish anything without external checking. This is excellent for freedom of speech and for universal access to the medium, but it is very limiting in terms of quality control. A quick search on AltaVista for a simple typing error such as ‘univeristy’ comes up with over 80,000 pages containing the error, and the vast majority of these pages are official institutional pages.

Many sites also have sloppy HTML coding, with broken images, links that don’t work, browser incompatibilities and other errors and omissions that demonstrate a poor approach to quality control on the part of the web team, often caused by lack of resources. A web site should reflect favourably on your organisation, deliver a message and present a positive image. Few things undermine customer confidence more effectively than sloppy workmanship.

Establishing and maintaining the technical quality of a web site involves at least five steps (the first two can be partly automated, as sketched below):
• Validating the HTML
• Testing the page
• Proof-reading the content
• Testing the instructions
• Testing the back end.
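As a minimal illustration of automating those first two steps, the sketch below fetches a page, reports links that no longer resolve, and flags images without alt text (which also anticipates the accessibility discussion that follows). It assumes Python and the third-party requests library, neither of which is mentioned in the original text; the starting URL is a placeholder.

# Sketch only: check a page for broken links and images without alt text.
from html.parser import HTMLParser
from urllib.parse import urljoin

import requests


class LinkAndAltChecker(HTMLParser):
    # Collect link/image targets and note images missing alt text.
    def __init__(self):
        super().__init__()
        self.targets = []
        self.images_without_alt = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "a" and attrs.get("href"):
            self.targets.append(attrs["href"])
        if tag == "img":
            if attrs.get("src"):
                self.targets.append(attrs["src"])
            if not attrs.get("alt"):
                self.images_without_alt.append(attrs.get("src", "(no src)"))


def check_page(url):
    parser = LinkAndAltChecker()
    parser.feed(requests.get(url, timeout=10).text)

    for target in parser.targets:
        if target.startswith(("mailto:", "javascript:", "#")):
            continue
        absolute = urljoin(url, target)
        try:
            status = requests.head(absolute, allow_redirects=True, timeout=10).status_code
        except requests.RequestException:
            status = None
        if status is None or status >= 400:
            print("BROKEN LINK:", absolute, status)

    for src in parser.images_without_alt:
        print("IMAGE WITHOUT ALT TEXT:", src)


if __name__ == "__main__":
    check_page("https://example.org/")  # hypothetical starting page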

During the meeting the RNIB people sat there describing many detailed and specific HTML problems, while the BBC people, myself included, nodded awkwardly, avoided eye contact, and generally shifted around uncomfortably in our chairs. However, towards the end of the meeting I realised that many of the problems could be fixed automatically by a CGI filter script, so I asked if anyone had done this before. When I heard that the answer was no, I went off and had a go at doing it myself, and by the following weekend I had a working first draft of what was to become Betsie.

Despite the best efforts of the W3C (World Wide Web Consortium), the vast majority of web sites out there still have major accessibility problems. These problems can be divided into two categories: those that are the result of poor editorial strategies, and those that are the result of poor coding strategies.
id: 83a0d4e8848a6233cb5bc863b92afc30 - page: 8
Problems in the first category, such as poorly written (or unwritten) alt text for images, cannot easily be fixed by software. Although interesting experiments in this area have been carried out, the fact is that human intervention is required to solve the vast majority of editorial problems. Problems in the second category, however, can often be solved using software, although it does depend on just how poor the code involved is. One example of this type of problem is the prevalent use of columnar design in web pages, as if the Web were some kind of DTP-like medium. Such designs are typically handled badly by the access software used by the blind and visually impaired community.
id: 79a49d38937d11cc68a2bda13e5f4b3c - page: 8
Using a text-only browser or a decolumnising browser does not necessarily help much, since the same long, left-hand navigation list tends to appear at the top of every page, which makes it hard to know whether or not a new page is in fact the one that you wanted. A server-side filter transform tool can be used instead, to ensure that when a page is decolumnised the standard navigation elements appear at the bottom instead of at the top, resulting in an on-the-fly text-only version of the site that is actually navigable by users. The alert reader will already have guessed that Betsie is, in fact, just such a tool.
id: f3a28450a40ae39ac8e288efeb0c40a4 - page: 8
Issues, Solutions and Outcomes

Once I had my list of specific issues raised by the RNIB, and I had figured out the basics of implementing a CGI filter script in Perl, all that remained was to work out which of those issues could be solved using Perl and which would require direct human intervention. It turned out that decolumnisation was a fairly simple matter of identifying the standard left-hand navigation bar (on the BBC site), moving it to the bottom of the page, and then removing all table tags. This produced a result not unlike that of decolumnising text-only browsers such as Lynx, but one that was more practically useful. However, if a page had been designed such that the tables in it did not degrade gracefully, there was little that Betsie could do about the resulting mess.
id: fa4deb1edef1f095ef32c55f4e049d22 - page: 8
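Betsie itself was a Perl CGI script and its real matching rules are not reproduced here, but the transform described above is simple enough to sketch: cut the navigation block out of its original position, append it at the bottom of the page, and strip the table markup so the columns linearise. The Python below is an illustration under stated assumptions; in particular, the HTML comment markers used to spot the navigation bar are hypothetical, and a real site would need its own rules.

# Sketch only (not Betsie's actual Perl code) of the decolumnising transform.
import re

NAV_PATTERN = re.compile(
    r"<!--\s*nav start\s*-->(.*?)<!--\s*nav end\s*-->",
    re.IGNORECASE | re.DOTALL,
)
TABLE_TAGS = re.compile(r"</?(?:table|tr|td|th)\b[^>]*>", re.IGNORECASE)


def decolumnise(html):
    # Cut the navigation block out of its original (top/left) position.
    nav_html = ""
    match = NAV_PATTERN.search(html)
    if match:
        nav_html = match.group(1)
        html = html[:match.start()] + html[match.end():]

    # Re-insert the navigation just before </body>, i.e. at the bottom.
    if nav_html:
        html = re.sub(r"</body>", lambda m: nav_html + "\n</body>",
                      html, count=1, flags=re.IGNORECASE)

    # Removing table tags linearises the columns, much as Lynx would.
    return TABLE_TAGS.sub("", html)


if __name__ == "__main__":
    sample = (
        "<html><body><table><tr>"
        "<td><!-- nav start --><a href='/a'>A</a> <a href='/b'>B</a>"
        "<!-- nav end --></td>"
        "<td><h1>Story</h1><p>Body text.</p></td>"
        "</tr></table></body></html>"
    )
    print(decolumnise(sample))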
How to Retrieve?
# Search

curl -X POST "https://search.dria.co/hnsw/search" \
-H "x-api-key: <YOUR_API_KEY>" \
-H "Content-Type: application/json" \
-d '{"rerank": true, "top_n": 10, "contract_id": "YoGTL1fPh4bO1mZbYJAj2Ui26-8qmc0iuGraGKjP2mY", "query": "What is alexanDRIA library?"}'
        
# Query

curl -X POST "https://search.dria.co/hnsw/query" \
-H "x-api-key: <YOUR_API_KEY>" \
-H "Content-Type: application/json" \
-d '{"vector": [0.123, 0.5236], "top_n": 10, "contract_id": "YoGTL1fPh4bO1mZbYJAj2Ui26-8qmc0iuGraGKjP2mY", "level": 2}'